[2410.04264] Decoupling Dynamical Richness from Representation Learning: Towards Practical Measurement
arXiv:2410.04264 (stat)
[Submitted on 5 Oct 2024 (v1), last revised 2 Mar 2026 (this version, v3)]

Title: Decoupling Dynamical Richness from Representation Learning: Towards Practical Measurement
Authors: Yoonsoo Nam, Nayara Fonseca, Seok Hyeong Lee, Chris Mingard, Niclas Goring, Ouns El Harzli, Abdurrahman Hadi Erturk, Soufiane Hayou, Ard A. Louis

Abstract: Dynamic feature transformation (the rich regime) does not always align with predictive performance (better representation), yet accuracy is often used as a proxy for richness, limiting analysis of their relationship. We propose a computationally efficient, performance-independent metric of richness grounded in the low-rank bias of rich dynamics, which recovers neural collapse as a special case. The metric is empirically more stable than existing alternatives and captures known lazy-to-rich transitions (e.g., grokking) without relying on accuracy. We further use it to examine how training factors (e.g., learning rate) relate to richness, confirming recognized assumptions and highlighting new observations (e.g., batch normalization promotes rich dynamics). An eigendecomposition-based visualization is also introduced to support interpretability, together providing a diagnostic tool for studying the relationship be...
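The abstract's exact metric is not given on this page, but the idea of a performance-independent, low-rank-based richness measure can be illustrated with a hypothetical stand-in: the entropy-based effective rank of the feature covariance, computed via eigendecomposition. Rich (feature-learning) dynamics are associated with a low-rank bias, so a drop in effective rank over training could signal a lazy-to-rich transition. The function name and metric choice here are illustrative assumptions, not the paper's method.

```python
import numpy as np

def effective_rank(features: np.ndarray) -> float:
    """Entropy-based effective rank of a (samples x dims) feature matrix.

    Hypothetical proxy for dynamical richness: lower effective rank
    indicates stronger low-rank structure in the learned features.
    """
    # Eigendecomposition of the centered feature covariance
    X = features - features.mean(axis=0, keepdims=True)
    cov = X.T @ X / X.shape[0]
    eigvals = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    # Normalize the spectrum to a probability distribution
    p = eigvals / eigvals.sum()
    p = p[p > 0]
    # exp(spectral entropy) interpolates between 1 and full rank
    return float(np.exp(-(p * np.log(p)).sum()))

# Near-rank-1 features give an effective rank close to 1,
# while isotropic Gaussian features give one near the full dimension.
rng = np.random.default_rng(0)
low_rank = rng.normal(size=(200, 1)) @ rng.normal(size=(1, 32))
low_rank += 0.01 * rng.normal(size=(200, 32))
print(effective_rank(low_rank))
print(effective_rank(rng.normal(size=(200, 32))))
```

Tracking such a quantity across checkpoints would be accuracy-independent, matching the abstract's stated goal of decoupling richness from predictive performance.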