[2603.23823] Circuit Complexity of Hierarchical Knowledge Tracing and Implications for Log-Precision Transformers


arXiv - Machine Learning


Computer Science > Machine Learning

arXiv:2603.23823 (cs) [Submitted on 25 Mar 2026]

Title: Circuit Complexity of Hierarchical Knowledge Tracing and Implications for Log-Precision Transformers

Authors: Naiming Liu, Richard Baraniuk, Shashank Sonkar

Abstract: Knowledge tracing models mastery over interconnected concepts, often organized by prerequisites. We analyze hierarchical prerequisite propagation through a circuit-complexity lens to clarify what is provable about transformer-style computation on deep concept hierarchies. Using recent results that log-precision transformers lie in logspace-uniform $\mathsf{TC}^0$, we formalize prerequisite-tree tasks including recursive-majority mastery propagation. Unconditionally, recursive-majority propagation lies in $\mathsf{NC}^1$ via $O(\log n)$-depth bounded-fanin circuits, while separating it from uniform $\mathsf{TC}^0$ would require major progress on open lower bounds. Under a monotonicity restriction, we obtain an unconditional barrier: alternating ALL/ANY prerequisite trees yield a strict depth hierarchy for \emph{monotone} threshold circuits. Empirically, transformer encoders trained on recursive-majority trees converge to permutation-invariant shortcuts; explicit structure alone does not prevent this, but auxiliary supervision on ...
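The two prerequisite-tree tasks named in the abstract can be sketched concretely. Below is a minimal illustrative Python sketch, not taken from the paper: the ternary branching factor, the tuple-based tree encoding, and the function names are assumptions made for the example. It shows recursive-majority propagation (a node is mastered iff a majority of its prerequisite subtrees are) and the alternating ALL/ANY evaluation behind the monotone depth hierarchy.

```python
def recursive_majority(node):
    """Recursive-majority mastery propagation (illustrative sketch).

    A node is either a leaf bool (observed mastery) or a tuple of
    three subtrees; an internal concept counts as mastered iff a
    majority of its prerequisites are mastered.
    """
    if isinstance(node, bool):
        return node
    votes = sum(recursive_majority(child) for child in node)
    return votes >= 2  # majority of 3 children


def alternating_all_any(node, use_all=True):
    """Alternating ALL/ANY prerequisite tree (illustrative sketch).

    The root requires ALL of its prerequisites, its children require
    ANY of theirs, and the quantifiers keep alternating with depth.
    """
    if isinstance(node, bool):
        return node
    combine = all if use_all else any
    return combine(alternating_all_any(child, not use_all) for child in node)


# Depth-2 recursive-majority example: two of three subtrees are mastered.
tree = ((True, True, False),
        (False, False, True),
        (True, False, True))
print(recursive_majority(tree))          # → True
print(alternating_all_any(tree))         # ALL of (ANY, ANY, ANY) → True
```

Each call unwinds one tree level, mirroring the $O(\log n)$-depth bounded-fanin circuit the paper invokes for the $\mathsf{NC}^1$ upper bound: fan-in stays constant (3) and the recursion depth equals the tree depth.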

Originally published on March 26, 2026. Curated by AI News.

