[2602.19033] A Markovian View of Iterative-Feedback Loops in Image Generative Models: Neural Resonance and Model Collapse
Summary
This paper explores iterative feedback loops in image generative models, introducing the concept of neural resonance and its implications for model collapse.
Why It Matters
As AI-generated content increasingly finds its way into training data, understanding the dynamics of model-to-model feedback loops is essential for keeping generative models stable. This research provides insights into diagnosing and mitigating model collapse, which is vital for deploying AI reliably in creative fields and beyond.
Key Takeaways
- Iterative feedback in AI can lead to model collapse, a poorly understood phenomenon.
- Neural resonance is identified as a key mechanism in feedback processes affecting model stability.
- The study introduces a taxonomy of collapse behaviors to better characterize these issues.
- Conditions for neural resonance include ergodicity and directional contraction in latent space.
- Practical diagnostics for identifying and mitigating collapse are proposed.
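The contraction condition can be illustrated with a toy sketch (a simplified illustration, not the paper's code or experiments): iterating a linear map that contracts most latent directions while nearly preserving one, plus small noise, concentrates samples onto a low-dimensional invariant structure. The participation ratio of the covariance spectrum, a standard effective-dimensionality measure, falls well below the ambient dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 20  # ambient latent dimension

# Directional contraction: strong contraction (0.5) along most axes,
# near-preservation (0.98) along one dominant axis.
s = np.full(d, 0.5)
s[0] = 0.98
Q = np.linalg.qr(rng.standard_normal((d, d)))[0]  # random orthogonal basis
A = Q @ np.diag(s) @ Q.T

# Iterate the feedback map x_{t+1} = A x_t + noise on a batch of samples.
X = rng.standard_normal((500, d))
for _ in range(300):
    X = X @ A.T + 0.05 * rng.standard_normal(X.shape)

# Effective dimensionality via the participation ratio of covariance eigenvalues.
eig = np.sort(np.linalg.eigvalsh(np.cov(X.T)))[::-1]
participation_ratio = eig.sum() ** 2 / (eig ** 2).sum()
print(f"effective dimension ~ {participation_ratio:.1f} of {d}")
```

In this sketch the stationary variance along each eigen-direction is proportional to 1/(1 - s_i^2), so the nearly preserved direction dominates and the effective dimension drops far below d.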
Computer Science > Machine Learning
arXiv:2602.19033 (cs)
Submitted on 22 Feb 2026
Title: A Markovian View of Iterative-Feedback Loops in Image Generative Models: Neural Resonance and Model Collapse
Authors: Vibhas Kumar Vats, David J. Crandall, Samuel Goree
Abstract: AI training datasets will inevitably contain AI-generated examples, leading to "feedback" in which the output of one model impacts the training of another. It is known that such iterative feedback can lead to model collapse, yet the mechanisms underlying this degeneration remain poorly understood. Here we show that a broad class of feedback processes converges to a low-dimensional invariant structure in latent space, a phenomenon we call neural resonance. By modeling iterative feedback as a Markov chain, we show that two conditions are needed for this resonance to occur: ergodicity of the feedback process and directional contraction of the latent representation. By studying diffusion models on MNIST and ImageNet, as well as CycleGAN and an audio feedback experiment, we map how local and global manifold geometry evolve, and we introduce an eight-pattern taxonomy of collapse behaviors. Neural resonance provides a unified explanation for long-term degenerate behavior in generative models and provides pra...
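The abstract's other condition, ergodicity, can also be sketched in miniature (a generic illustration under assumed numbers, not taken from the paper): an ergodic Markov chain forgets its starting point and converges to a unique invariant distribution, which is what lets the long-run behavior of the feedback loop be characterized independently of the initial model.

```python
import numpy as np

# A toy ergodic transition matrix (rows sum to 1, all states communicate).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Iterate the distribution from two opposite starting points.
results = []
for start in ([1.0, 0.0], [0.0, 1.0]):
    pi = np.array(start)
    for _ in range(200):
        pi = pi @ P  # one step of the chain
    results.append(pi)

# Both starts land on the same invariant distribution (2/3, 1/3).
print(np.round(results[0], 3), np.round(results[1], 3))
```

The same invariance logic, lifted from a two-state chain to a feedback process over latent representations, is what the paper's Markovian view relies on.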