[2602.17700] MIDAS: Mosaic Input-Specific Differentiable Architecture Search

arXiv - Machine Learning · 3 min read · Article

Summary

MIDAS introduces a novel approach to differentiable neural architecture search that replaces static architecture parameters with dynamic, input-specific parameters computed via self-attention, improving both robustness and performance.

Why It Matters

This research addresses the limitations of existing neural architecture search methods by proposing a more dynamic and efficient framework. MIDAS's ability to optimize architectures for specific inputs could lead to significant advancements in machine learning applications, particularly in areas requiring high accuracy and adaptability.

Key Takeaways

  • MIDAS modernizes DARTS by replacing static architecture parameters with dynamic, input-specific ones computed via self-attention.
  • The method finds globally optimal architectures on NAS-Bench-201 and sets the state of the art on two of four RDARTS search spaces on CIFAR-10.
  • Patchwise attention computes architecture weights separately for each spatial patch, improving discrimination among candidate operations.
  • A parameter-free, topology-aware search space models node connectivity and simplifies selecting the two incoming edges per node.
  • The resulting input-specific parameter distributions are class-aware, and the approach demonstrates improved robustness.

Computer Science > Machine Learning
arXiv:2602.17700 (cs) · Submitted on 6 Feb 2026
Title: MIDAS: Mosaic Input-Specific Differentiable Architecture Search
Authors: Konstanty Subbotko

Abstract: Differentiable Neural Architecture Search (NAS) provides efficient, gradient-based methods for automatically designing neural networks, yet its adoption remains limited in practice. We present MIDAS, a novel approach that modernizes DARTS by replacing static architecture parameters with dynamic, input-specific parameters computed via self-attention. To improve robustness, MIDAS (i) localizes the architecture selection by computing it separately for each spatial patch of the activation map, and (ii) introduces a parameter-free, topology-aware search space that models node connectivity and simplifies selecting the two incoming edges per node. We evaluate MIDAS on the DARTS, NAS-Bench-201, and RDARTS search spaces. In DARTS, it reaches 97.42% top-1 on CIFAR-10 and 83.38% on CIFAR-100. In NAS-Bench-201, it consistently finds globally optimal architectures. In RDARTS, it sets the state of the art on two of four search spaces on CIFAR-10. We further analyze why MIDAS works, showing that patchwise attention improves discrimination among candidate operations, and the resulting input-specific parameter distributions are class-aware and predominantly u...
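The core idea in the abstract, replacing DARTS's single, static set of mixed-operation weights with weights recomputed from each spatial patch of the input, can be sketched in plain Python. This is a toy illustration under stated assumptions, not the paper's implementation: the candidate operations act on scalar "patch features", and `toy_scores` is a hypothetical stand-in for the paper's self-attention module.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

# Toy candidate operations on a scalar "patch feature"
# (stand-ins for the conv/pool ops in a real DARTS cell).
CANDIDATE_OPS = [
    lambda x: x,            # identity / skip connection
    lambda x: 2.0 * x,      # stand-in for a convolution
    lambda x: max(x, 0.0),  # stand-in for a pooling/nonlinearity
]

def static_mixed_op(x, alpha):
    """Classic DARTS: one global alpha shared by every input."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, CANDIDATE_OPS))

def input_specific_mixed_op(patches, score_fn):
    """MIDAS-style (schematic): weights are recomputed per spatial
    patch from the patch content itself, so different patches of the
    same input can favor different candidate operations."""
    out = []
    for p in patches:
        alpha_p = score_fn(p)  # input-specific logits for this patch
        w = softmax(alpha_p)
        out.append(sum(wi * op(p) for wi, op in zip(w, CANDIDATE_OPS)))
    return out

# Hypothetical scoring function standing in for the paper's
# self-attention module: here just a fixed linear map per op.
def toy_scores(p):
    return [0.1 * p, -0.2 * p, 0.5 * p]

patches = [-1.0, 0.5, 2.0]
mixed = input_specific_mixed_op(patches, toy_scores)
```

The contrast with `static_mixed_op` is the point: in classic DARTS every input (and every patch) is mixed with the same softmax weights, whereas here each patch gets its own distribution over candidate operations.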

Related Articles

UMKC Announces New Master of Science in Artificial Intelligence
AI Infrastructure

UMKC announces a new Master of Science in Artificial Intelligence program aimed at addressing workforce demand for AI expertise, set to l...

AI News - General · 4 min ·
Alabama A&M University chosen for Amazon Web Services AI training program
Machine Learning

Alabama A&M University has been selected as one of just five institutions nationwide to participate in Amazon Web Services' Machine Learn...

AI News - General · 2 min ·
Interpretable machine learning model advances analysis of complex genetic traits
Machine Learning

A new study published in Genome Research presents an interpretable artificial intelligence framework that improves both the accuracy and ...

AI News - General · 6 min ·
Sam Altman's Coworkers Say He Can Barely Code and Misunderstands Basic Machine Learning Concepts
Machine Learning

The OpenAI CEO reportedly confuses basic coding and machine learning terms, numerous insiders have admitted.

AI News - General · 2 min ·

