[2503.10404] Architecture-Aware Minimization (A$^2$M): How to Find Flat Minima in Neural Architecture Search
Computer Science > Machine Learning
arXiv:2503.10404 (cs)
[Submitted on 13 Mar 2025 (v1), last revised 24 Mar 2026 (this version, v3)]

Title: Architecture-Aware Minimization (A$^2$M): How to Find Flat Minima in Neural Architecture Search
Authors: Matteo Gambella, Fabrizio Pittorino, Manuel Roveri

Abstract: Neural Architecture Search (NAS) has become an essential tool for designing effective and efficient neural networks. In this paper, we investigate the geometric properties of neural architecture spaces commonly used in differentiable NAS methods, specifically NAS-Bench-201 and DARTS. By defining flatness metrics such as neighborhoods and loss barriers along paths in architecture space, we reveal locality and flatness characteristics analogous to the well-known properties of neural network loss landscapes in weight space. In particular, we find that highly accurate architectures cluster together in flat regions, while suboptimal architectures remain isolated, unveiling the detailed geometrical structure of the architecture search landscape. Building on these insights, we propose Architecture-Aware Minimization (A$^2$M), a novel analytically derived algorithmic framework that, for the first time, explicitly biases the gradient of differentiable NAS methods towards flat minima in architecture space.
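The idea of biasing a gradient-based search towards flat minima can be illustrated with a sharpness-aware update in the style of SAM, applied to the architecture parameters of a differentiable NAS supernet. The sketch below is a hypothetical toy illustration, not the authors' exact A$^2$M algorithm: `loss` is a stand-in quadratic landscape for the NAS validation loss, `alpha` stands in for the continuous architecture parameters, and `rho` is an assumed neighborhood radius.

```python
import numpy as np

def loss(alpha):
    # Toy quadratic landscape standing in for the NAS validation loss.
    return 0.5 * np.sum(alpha ** 2)

def grad(alpha):
    # Gradient of the toy loss w.r.t. the architecture parameters.
    return alpha

def flatness_aware_step(alpha, lr=0.1, rho=0.05):
    """One sharpness-aware update: ascend to the (approximate) worst-case
    neighbor within radius rho, then descend using the gradient evaluated
    at that perturbed point, which biases the iterates toward flat regions."""
    g = grad(alpha)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # worst-case perturbation
    g_flat = grad(alpha + eps)                   # gradient at perturbed point
    return alpha - lr * g_flat

alpha = np.array([1.0, -2.0])
for _ in range(100):
    alpha = flatness_aware_step(alpha)
```

In actual differentiable NAS, the same two-step update would be applied to the architecture weights of the supernet (e.g. the mixing coefficients in DARTS) rather than to a toy vector, with the inner ascent step estimating the sharpness of the surrounding neighborhood in architecture space.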