[2603.20014] AgenticRS-EnsNAS: Ensemble-Decoupled Self-Evolving Architecture Search
Computer Science > Machine Learning
arXiv:2603.20014 (cs)
[Submitted on 20 Mar 2026]

Title: AgenticRS-EnsNAS: Ensemble-Decoupled Self-Evolving Architecture Search
Authors: Yun Chen, Moyu Zhang, Jinxin Hu, Yu Zhang, Xiaoyi Zeng

Abstract: Neural Architecture Search (NAS) deployment in industrial production systems faces a fundamental validation bottleneck: verifying a single candidate architecture pi requires evaluating the deployed ensemble of M models, incurring prohibitive O(M) computational cost per candidate. This cost barrier severely limits architecture iteration frequency in real-world applications, where ensembles (M = 50-200) are standard for robustness. This work introduces Ensemble-Decoupled Architecture Search, a framework that leverages ensemble theory to predict system-level performance from single-learner evaluation. We establish the Ensemble-Decoupled Theory with a sufficient condition for monotonic ensemble improvement under homogeneity assumptions: a candidate architecture pi yields lower ensemble error than the current baseline if rho(pi) < rho(pi_old) - (M / (M - 1)) * (Delta E(pi) / sigma^2(pi)), where Delta E, rho, and sigma^2 are estimable from lightweight dual-learner training. This decouples architecture search from full ensemble training, reducing per-candidate search cost from O(M) to O(1) wh...
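The sufficient condition quoted in the abstract can be checked directly once the three quantities are estimated from dual-learner training. The sketch below is an illustration of that inequality only, not the paper's implementation; the function name, the sign convention for Delta E (new single-learner error minus baseline, so an improvement is negative), and the example numbers are all assumptions for demonstration.

```python
def ensemble_improvement_condition(rho_new: float,
                                   rho_old: float,
                                   delta_E: float,
                                   sigma2: float,
                                   M: int) -> bool:
    """Sufficient condition for monotonic ensemble improvement
    (as stated in the abstract, under homogeneity assumptions):

        rho(pi) < rho(pi_old) - (M / (M - 1)) * (Delta E(pi) / sigma^2(pi))

    rho_new  -- estimated pairwise correlation of the candidate architecture
    rho_old  -- pairwise correlation of the current baseline
    delta_E  -- single-learner error change (assumed: new minus old)
    sigma2   -- estimated single-learner error variance of the candidate
    M        -- ensemble size (e.g. 50-200 in the deployed systems described)
    """
    threshold = rho_old - (M / (M - 1)) * (delta_E / sigma2)
    return rho_new < threshold


# Hypothetical candidate: slightly better single-learner error (delta_E < 0)
# loosens the correlation budget, so a moderately correlated candidate passes.
print(ensemble_improvement_condition(
    rho_new=0.25, rho_old=0.30, delta_E=-0.01, sigma2=0.05, M=100))

# A highly correlated candidate fails even with the same error improvement.
print(ensemble_improvement_condition(
    rho_new=0.60, rho_old=0.30, delta_E=-0.01, sigma2=0.05, M=100))
```

Because the right-hand side depends only on scalars estimable from two trained learners, each candidate can be screened in O(1) rather than by retraining the full M-model ensemble.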