[2602.01434] Phase Transitions for Feature Learning in Neural Networks

arXiv - Machine Learning · 4 min read

Summary

This paper studies phase transitions in feature learning: it analyzes the gradient descent dynamics of two-layer neural networks trained on multi-index data and establishes a sample-to-dimension threshold for effective feature learning.

Why It Matters

Understanding phase transitions in neural networks is crucial for predicting how much data a network needs before it can learn useful representations. This research provides insight into the dynamics of learning in high-dimensional spaces, which can inform the design of more effective neural network architectures and training algorithms.

Key Takeaways

  • Identifies a sample-to-dimension threshold (n/d → δ) for effective feature learning in two-layer neural networks.
  • Analyzes learning dynamics in the proportional regime n, d → ∞ with the latent dimension k and hidden width m held fixed.
  • Explains the role of gradient descent dynamics in recovering the latent subspace during training.
  • Highlights the significance of phase transitions in neural network training.
  • Provides a formal framework for analyzing learning in high-dimensional spaces.

Computer Science > Machine Learning
arXiv:2602.01434 (cs) [Submitted on 1 Feb 2026 (v1), last revised 26 Feb 2026 (this version, v2)]
Title: Phase Transitions for Feature Learning in Neural Networks
Authors: Andrea Montanari, Zihao Wang

Abstract: According to a popular viewpoint, neural networks learn from data by first identifying low-dimensional representations, and subsequently fitting the best model in this space. Recent works provide a formalization of this phenomenon when learning multi-index models. In this setting, we are given $n$ i.i.d. pairs $({\boldsymbol x}_i, y_i)$, where the covariate vectors ${\boldsymbol x}_i \in \mathbb{R}^d$ are isotropic, and responses $y_i$ only depend on ${\boldsymbol x}_i$ through a $k$-dimensional projection ${\boldsymbol \Theta}_*^{\sf T} {\boldsymbol x}_i$. Feature learning amounts to learning the latent space spanned by ${\boldsymbol \Theta}_*$. In this context, we study the gradient descent dynamics of two-layer neural networks under the proportional asymptotics $n, d \to \infty$, $n/d \to \delta$, while the dimension of the latent space $k$ and the number of hidden neurons $m$ are kept fixed. Earlier work establishes that feature learning via polynomial-time algorithms is possible if $\delta > \delta_{\text{alg}}$, for $\delta_{\text{alg}}$ a threshold depending on the data distribution, and is impossible (wi...
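To make the setting above concrete, here is a minimal NumPy sketch of the multi-index setup: isotropic Gaussian covariates, responses that depend on each input only through a k-dimensional projection, and a small two-layer network trained by full-batch gradient descent. The link function, step size, and all dimensions below are illustrative assumptions, not values from the paper.

# Minimal sketch of the multi-index setting (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

d, k, m = 200, 2, 8      # ambient dimension, latent dimension, hidden neurons
delta = 4.0              # sample-to-dimension ratio n/d
n = int(delta * d)

# Ground-truth latent subspace Theta_* in R^{d x k} with orthonormal columns.
Theta_star, _ = np.linalg.qr(rng.standard_normal((d, k)))

# Isotropic covariates; responses depend on x_i only through Theta_*^T x_i.
X = rng.standard_normal((n, d))
Z = X @ Theta_star                  # k-dimensional latent projections
y = np.tanh(Z[:, 0]) * Z[:, 1]      # a hypothetical link function

# Two-layer network f(x) = sum_j a_j * tanh(w_j . x), trained with
# full-batch gradient descent on the squared loss.
W = rng.standard_normal((m, d)) / np.sqrt(d)
a = rng.standard_normal(m) / np.sqrt(m)
lr, steps = 0.1, 500

for _ in range(steps):
    H = np.tanh(X @ W.T)            # (n, m) hidden activations
    err = H @ a - y                 # residuals of the squared loss
    grad_a = H.T @ err / n
    grad_W = (err[:, None] * (1.0 - H**2) * a).T @ X / n
    a -= lr * grad_a
    W -= lr * grad_W

# Feature-learning diagnostic: fraction of first-layer weight energy
# lying in span(Theta_*).
alignment = np.linalg.norm(W @ Theta_star, "fro")**2 / np.linalg.norm(W, "fro")**2
print(f"weight energy in latent subspace: {alignment:.3f}")

Re-running this sketch while sweeping delta would show the alignment diagnostic jumping from near zero to order one past a critical sample ratio; that is the kind of phase transition the paper formalizes, with the threshold $\delta_{\text{alg}}$ depending on the data distribution.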

