[2510.08023] Do We Really Need Permutations? Impact of Model Width on Linear Mode Connectivity


arXiv - Machine Learning


Computer Science > Machine Learning

arXiv:2510.08023 (cs)

[Submitted on 9 Oct 2025 (v1), last revised 5 Mar 2026 (this version, v2)]

Title: Do We Really Need Permutations? Impact of Model Width on Linear Mode Connectivity

Authors: Akira Ito, Masanori Yamada, Daiki Chijiwa, Atsutoshi Kumagai

Abstract: Recently, Ainsworth et al. empirically demonstrated that, given two independently trained models, applying a parameter permutation that preserves the input-output behavior allows the two models to be connected by a low-loss linear path. When such a path exists, the models are said to achieve linear mode connectivity (LMC). Prior studies, including Ainsworth et al. (2023), have reported that achieving LMC requires not only an appropriate permutation search but also sufficiently wide models (e.g., a 32 $\times$ width multiplier for ResNet-20). This is broadly believed to be because increasing the model width ensures a large enough space of candidate permutations, increasing the chance of finding one that yields LMC. In this work, we empirically demonstrate that, even without any permutations, simply widening the models is sufficient for achieving LMC when using a suitable softmax temperature calibration. We further explain why this phenomenon arises by analyzing intermediate layer outputs. Specifically, we int...
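The LMC criterion the abstract refers to can be made concrete with a small sketch: interpolate the two models' parameters along the straight line $\theta_\alpha = (1-\alpha)\theta_A + \alpha\theta_B$ and check that the loss barrier along the path is near zero. The code below is a minimal illustration, not the paper's implementation; the function names and the use of NumPy dicts as stand-ins for model parameters are this sketch's assumptions, and the temperature-scaled softmax is included only to show the kind of calibration the abstract mentions.

```python
import numpy as np

def interpolate_params(theta_a, theta_b, alpha):
    # Straight-line interpolation of two parameter sets
    # (here: dicts of name -> ndarray standing in for model weights).
    return {k: (1 - alpha) * theta_a[k] + alpha * theta_b[k] for k in theta_a}

def softmax(logits, temperature=1.0):
    # Temperature-calibrated softmax: T > 1 flattens the output
    # distribution, T < 1 sharpens it.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def loss_barrier(loss_fn, theta_a, theta_b, alphas=np.linspace(0, 1, 11)):
    # LMC holds (approximately) when this barrier is ~0: the worst loss
    # along the linear path does not exceed the mean endpoint loss.
    path = [loss_fn(interpolate_params(theta_a, theta_b, a)) for a in alphas]
    return max(path) - 0.5 * (path[0] + path[-1])
```

For a convex toy loss such as a sum of squares, the barrier evaluates to zero, which is exactly the behavior the paper reports for sufficiently wide models even without permutation matching.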

Originally published on March 06, 2026. Curated by AI News.


