[2602.12449] Computationally sufficient statistics for Ising models
Summary
This paper explores computationally sufficient statistics for Ising models, addressing the challenges of learning Gibbs distributions with limited observational data.
Why It Matters
Understanding how to efficiently learn Gibbs distributions with limited data is crucial for applications in statistical mechanics and machine learning. This research provides new methods for parameter reconstruction in Ising models, which can have significant implications in various scientific fields, including physics and data science.
Key Takeaways
- Efficient learning of Gibbs distributions is possible with limited statistics.
- The Ising model serves as a key example for demonstrating these methods.
- The study shows that parameters of a model with ℓ₁ width γ can be reconstructed from observed statistics up to order O(γ).
- Prior information about model structure enhances learning efficiency.
- This approach has potential applications in both physical systems and machine learning contexts.
Computer Science > Machine Learning
arXiv:2602.12449 (cs)
[Submitted on 12 Feb 2026]
Title: Computationally sufficient statistics for Ising models
Authors: Abhijith Jayakumar, Shreya Shukla, Marc Vuffray, Andrey Y. Lokhov, Sidhant Misra
Abstract: Learning Gibbs distributions using only sufficient statistics has long been recognized as a computationally hard problem. On the other hand, computationally efficient algorithms for learning Gibbs distributions rely on access to full sample configurations generated from the model. For many systems of interest that arise in physical contexts, expecting a full sample to be observed is not practical, and hence it is important to look for computationally efficient methods that solve the learning problem with access to only a limited set of statistics. We examine the trade-offs between the power of computation and observation within this scenario, employing the Ising model as a paradigmatic example. We demonstrate that it is feasible to reconstruct the model parameters for a model with $\ell_1$ width $\gamma$ by observing statistics up to an order of $O(\gamma)$. This approach allows us to infer the model's structure and also learn its couplings and magnetic fields. We also discuss a setting where prior information about the structure of the model is available and show that the learning ...
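To make the inverse problem concrete, here is a minimal sketch of reconstructing Ising couplings from low-order statistics alone (magnetizations and pairwise correlations) using naive mean-field inversion, J ≈ -(C⁻¹) off the diagonal. This is a standard textbook heuristic valid only at weak coupling, not the paper's algorithm; the model instance and coupling scales below are illustrative assumptions.

```python
import itertools
import numpy as np

def exact_moments(J, h):
    """Exact magnetizations <s_i> and pair moments <s_i s_j> for a small
    Ising model p(s) ∝ exp(s·h + s·J·s/2), by enumerating all 2^n states."""
    n = len(h)
    states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
    energies = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    w = np.exp(energies - energies.max())
    p = w / w.sum()
    m = p @ states                                       # <s_i>
    chi = np.einsum('k,ki,kj->ij', p, states, states)    # <s_i s_j>
    return m, chi

# A hypothetical 4-spin model with weak couplings (chosen so that the
# mean-field approximation is accurate).
rng = np.random.default_rng(0)
n = 4
J_true = np.triu(rng.normal(scale=0.1, size=(n, n)), 1)
J_true = J_true + J_true.T        # symmetric, zero diagonal
h_true = rng.normal(scale=0.05, size=n)

m, chi = exact_moments(J_true, h_true)
C = chi - np.outer(m, m)          # connected correlations

# Naive mean-field inversion: J_ij ≈ -(C^{-1})_{ij} for i != j.
J_est = -np.linalg.inv(C)
np.fill_diagonal(J_est, 0.0)
```

Note the contrast with the paper's setting: this inversion needs only second-order statistics but degrades badly at strong coupling, whereas the paper analyzes what can be learned efficiently from statistics up to order O(γ).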