[2603.26554] Sharp Capacity Scaling of Spectral Optimizers in Learning Associative Memory
Computer Science > Machine Learning
arXiv:2603.26554 (cs) [Submitted on 27 Mar 2026]

Title: Sharp Capacity Scaling of Spectral Optimizers in Learning Associative Memory
Authors: Juno Kim, Eshaan Nichani, Denny Wu, Alberto Bietti, Jason D. Lee

Abstract: Spectral optimizers such as Muon have recently shown strong empirical performance in large-scale language model training, but the source and extent of their advantage remain poorly understood. We study this question through the linear associative memory problem, a tractable model for factual recall in transformer-based models. In particular, we go beyond orthogonal embeddings and consider Gaussian inputs and outputs, which allows the number of stored associations to greatly exceed the embedding dimension. Our main result sharply characterizes the recovery rates of one step of Muon and SGD on the logistic regression loss under a power law frequency distribution. We show that the storage capacity of Muon significantly exceeds that of SGD, and moreover Muon saturates at a larger critical batch size. We further analyze the multi-step dynamics under a thresholded gradient approximation and show that Muon achieves a substantially faster initial recovery rate than SGD, while both methods eventually converge to the information-theoretic limit at comparable speeds. Expe...
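The setting the abstract describes can be sketched in code. The snippet below is a minimal illustration, not the paper's construction: it sets up a linear associative memory with Gaussian input/output embeddings, a softmax (logistic-style) recall loss, and compares one plain SGD step against one Muon-style spectral step, where the gradient is replaced by its orthogonal factor UVᵀ from an SVD. All dimensions, step sizes, and the exact loss form are illustrative assumptions; practical Muon implementations also approximate the orthogonalization with Newton-Schulz iterations rather than an exact SVD.

```python
import numpy as np

rng = np.random.default_rng(0)
d, N = 32, 48                                   # embedding dim; N associations may exceed d
X = rng.standard_normal((N, d)) / np.sqrt(d)    # Gaussian input embeddings (assumed scaling)
Y = rng.standard_normal((N, d)) / np.sqrt(d)    # Gaussian output embeddings

def loss_and_grad(W):
    """Softmax cross-entropy recall loss: input x_i should retrieve output y_i."""
    logits = X @ W.T @ Y.T                      # logits[i, j] = y_j^T W x_i
    logits -= logits.max(axis=1, keepdims=True) # numerical stability
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)
    loss = -np.mean(np.log(P[np.arange(N), np.arange(N)]))
    G = (P - np.eye(N)) / N                     # dL/dlogits
    return loss, Y.T @ G.T @ X                  # dL/dW

def sgd_step(W, lr=1.0):
    _, g = loss_and_grad(W)
    return W - lr * g                           # plain gradient step

def muon_step(W, lr=0.01):
    _, g = loss_and_grad(W)
    U, _, Vt = np.linalg.svd(g, full_matrices=False)
    return W - lr * (U @ Vt)                    # spectral step: orthogonalized gradient

W0 = np.zeros((d, d))                           # empty memory
l0, _ = loss_and_grad(W0)
l_sgd, _ = loss_and_grad(sgd_step(W0))
l_muon, _ = loss_and_grad(muon_step(W0))
print(l0, l_sgd, l_muon)
```

The spectral step treats all singular directions of the gradient equally, which is one intuition for why such updates can recover many associations at once; the paper's sharp capacity comparison is of course an analytical result, not something this toy run demonstrates.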