[2510.08023] Do We Really Need Permutations? Impact of Model Width on Linear Mode Connectivity
Computer Science > Machine Learning

arXiv:2510.08023 (cs)

[Submitted on 9 Oct 2025 (v1), last revised 5 Mar 2026 (this version, v2)]

Title: Do We Really Need Permutations? Impact of Model Width on Linear Mode Connectivity

Authors: Akira Ito, Masanori Yamada, Daiki Chijiwa, Atsutoshi Kumagai

Abstract: Recently, Ainsworth et al. empirically demonstrated that, given two independently trained models, applying a parameter permutation that preserves the input-output behavior allows the two models to be connected by a low-loss linear path. When such a path exists, the models are said to achieve linear mode connectivity (LMC). Prior studies, including Ainsworth et al. (2023), have reported that achieving LMC requires not only an appropriate permutation search but also sufficiently wide models (e.g., a 32 $\times$ width multiplier for ResNet-20). This is broadly believed to be because increasing the model width ensures a large enough space of candidate permutations, increasing the chance of finding one that yields LMC. In this work, we empirically demonstrate that, even without any permutations, simply widening the models is sufficient for achieving LMC when using a suitable softmax temperature calibration. We further explain why this phenomenon arises by analyzing intermediate layer outputs. Specifically, we int...
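The notion of linear mode connectivity in the abstract can be made concrete with a small sketch. LMC is typically quantified via the loss barrier along the straight line $\theta(\alpha) = (1-\alpha)\,\theta_A + \alpha\,\theta_B$ between two trained parameter vectors: the largest excess of the path loss over the linear interpolation of the endpoint losses. The toy loss function below is purely illustrative (it is not the paper's setup); it has two symmetric minima, so two independently "trained" solutions sit in different basins and the linear path between them crosses a high-loss region.

```python
import numpy as np

def loss(theta):
    # Toy non-convex loss with minima at (+1, -1) and (-1, +1); a stand-in
    # for a network's training loss (illustration only, not from the paper).
    x, y = theta
    return (x**2 - 1.0)**2 + (y**2 - 1.0)**2 + 0.5 * (x + y)**2

def loss_barrier(theta_a, theta_b, n=101):
    # Evaluate the loss along theta(alpha) = (1 - alpha)*theta_a + alpha*theta_b
    # and return the maximum excess over the linear interpolation of the
    # endpoint losses. LMC holds (approximately) when this barrier is ~0.
    alphas = np.linspace(0.0, 1.0, n)
    path = [loss((1.0 - a) * theta_a + a * theta_b) for a in alphas]
    endpoints = [(1.0 - a) * path[0] + a * path[-1] for a in alphas]
    return max(p - e for p, e in zip(path, endpoints))

theta_a = np.array([1.0, -1.0])   # one "trained model"
theta_b = np.array([-1.0, 1.0])   # an independently "trained model"
print(loss_barrier(theta_a, theta_b))  # nonzero barrier: no LMC without alignment
```

Permutation-based methods (e.g., Ainsworth et al.'s weight matching) try to map one endpoint into the other's basin before interpolating, driving this barrier toward zero; the paper's claim is that for sufficiently wide models, a suitable softmax temperature calibration achieves low-barrier interpolation without any permutation.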