[2603.20587] Neural collapse in the orthoplex regime
Computer Science > Machine Learning

arXiv:2603.20587 (cs) [Submitted on 21 Mar 2026]

Title: Neural collapse in the orthoplex regime
Authors: James Alcala, Rayna Andreeva, Vladimir A. Kobzar, Dustin G. Mixon, Sanghoon Na, Shashank Sule, Yangxinyu Xie

Abstract: When training a neural network for classification, the feature vectors of the training set are known to collapse to the vertices of a regular simplex, provided the dimension $d$ of the feature space and the number $n$ of classes satisfy $n \leq d+1$. This phenomenon is known as neural collapse. In other applications, such as language models, one instead takes $n \gg d$. The neural collapse phenomenon still occurs in this setting, but with different emergent geometric figures. We characterize these figures in the orthoplex regime, where $d+2 \leq n \leq 2d$. Our analysis primarily relies on Radon's theorem and convexity.

Subjects: Machine Learning (cs.LG); Information Theory (cs.IT); Metric Geometry (math.MG)
Cite as: arXiv:2603.20587 [cs.LG] (or arXiv:2603.20587v1 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2603.20587 (arXiv-issued DOI via DataCite; registration pending)

Submission history
From: Dustin Mixon [view email]
[v1] Sat, 21 Mar 2026 01:04:26 UTC (13 KB)
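To make the two regimes in the abstract concrete, the following NumPy sketch constructs the two geometric figures in question: the regular simplex of $n = d+1$ unit vectors (the classical neural-collapse configuration, with pairwise inner products $-1/d$) and the vertices $\pm e_i$ of the $d$-dimensional orthoplex (cross-polytope), which accommodate $n = 2d$ classes. This is an illustration of the geometry only, not code or analysis from the paper; the function names and constructions are my own.

```python
import numpy as np

def orthoplex_vertices(d):
    """Vertices {+e_i, -e_i} of the d-dimensional orthoplex (cross-polytope).

    Yields n = 2d unit vectors in R^d: antipodal pairs have inner product -1,
    and all other pairs are orthogonal.
    """
    eye = np.eye(d)
    return np.vstack([eye, -eye])  # shape (2d, d)

def simplex_vertices(d):
    """n = d+1 unit vectors in R^d forming a regular simplex centered at 0.

    All pairwise inner products equal -1/d. Construction: center the
    standard basis of R^n, then express the centered vertices in an
    orthonormal basis of the d-dimensional subspace they span.
    """
    n = d + 1
    M = np.eye(n) - np.ones((n, n)) / n          # centered simplex in R^n
    U, S, _ = np.linalg.svd(M)                   # top d singular values are nonzero
    V = U[:, :d] * S[:d]                         # coordinates in R^d (Gram preserved)
    return V / np.linalg.norm(V, axis=1, keepdims=True)

if __name__ == "__main__":
    d = 4
    # Simplex regime: n = d+1 = 5 classes, common inner product -1/d = -0.25.
    Vs = simplex_vertices(d)
    print(np.round(Vs @ Vs.T, 3))
    # Orthoplex regime: n = 2d = 8 classes; off-diagonal Gram entries are 0 or -1.
    Vo = orthoplex_vertices(d)
    print(np.round(Vo @ Vo.T, 3))
```

The Gram matrices printed here summarize the geometry: the simplex configuration spreads $d+1$ vectors with a single common (negative) correlation, while the orthoplex packs $2d$ vectors by pairing each direction with its antipode, which is why larger class counts $n \leq 2d$ can still be accommodated in $d$ dimensions.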