[2509.20993] Learning the Inverse Temperature of Ising Models under Hard Constraints using One Sample

arXiv - Machine Learning 4 min read Article

Summary

This paper presents a method for estimating the inverse temperature parameter of truncated Ising models using a single sample, focusing on the implications of hard constraints in the model.

Why It Matters

Estimating the inverse temperature of an Ising model is a core task in statistical physics and machine learning, but classical methods assume access to many independent samples. This research addresses the harder single-sample setting, and does so under hard constraints, where configurations are restricted to a combinatorial truncation set such as the satisfying assignments of a k-SAT formula.

Key Takeaways

  • Introduces an estimator $\hat{\beta}$ for the inverse temperature of truncated Ising models, computed from a single sample.
  • The estimator runs in nearly $O(n)$ time and achieves an $O(\Delta^3/\sqrt{n})$ error guarantee on graphs of maximum degree $\Delta$, making it practical for large graphs.
  • The method generalizes existing single-sample estimation techniques to settings with hard constraints, where the truncation set is given by a k-SAT formula.

Computer Science > Machine Learning

arXiv:2509.20993 (cs) [Submitted on 25 Sep 2025 (v1), last revised 14 Feb 2026 (this version, v2)]

Title: Learning the Inverse Temperature of Ising Models under Hard Constraints using One Sample
Authors: Rohan Chauhan, Ioannis Panageas

Abstract: We consider the problem of estimating the inverse temperature parameter $\beta$ of an $n$-dimensional truncated Ising model using a single sample. Given a graph $G = (V,E)$ with $n$ vertices, a truncated Ising model is a probability distribution over the $n$-dimensional hypercube $\{-1,1\}^n$ where each configuration $\mathbf{\sigma}$ is constrained to lie in a truncation set $S \subseteq \{-1,1\}^n$ and has probability $\Pr(\mathbf{\sigma}) \propto \exp(\beta\mathbf{\sigma}^\top A\mathbf{\sigma})$, with $A$ being the adjacency matrix of $G$. We adopt the recent setting of [Galanis et al. SODA'24], where the truncation set $S$ can be expressed as the set of satisfying assignments of a $k$-SAT formula. Given a single sample $\mathbf{\sigma}$ from a truncated Ising model with inverse temperature $\beta^*$, underlying graph $G$ of bounded degree $\Delta$, and $S$ expressed as the set of satisfying assignments of a $k$-SAT formula, we design in nearly $O(n)$ time an estimator $\hat{\beta}$ that is $O(\Delta^3/\sqrt{n})$...
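The abstract does not spell out the estimator's construction, but single-sample estimators of this kind are commonly built on maximum pseudo-likelihood: maximize the product over vertices of each spin's conditional likelihood given its neighbors. For the untruncated model $\Pr(\mathbf{\sigma}) \propto \exp(\beta\mathbf{\sigma}^\top A\mathbf{\sigma})$, this reduces to a one-dimensional concave problem whose derivative is monotone in $\beta$, so it can be solved by bisection. The sketch below illustrates that classical untruncated idea only; it is not the paper's algorithm (which must additionally handle the k-SAT truncation set), and the function name `mple_beta` and the search range $[0, \texttt{beta\_max}]$ are assumptions made for illustration.

```python
import numpy as np

def mple_beta(A, sigma, beta_max=5.0, tol=1e-8):
    """Single-sample pseudo-likelihood estimate of beta for an
    (untruncated) Ising model Pr(sigma) ∝ exp(beta * sigma^T A sigma).

    A     : symmetric (n, n) adjacency matrix with zero diagonal
    sigma : length-n array of +/-1 spins (one sample)
    """
    m = A @ sigma  # local fields m_v = sum_u A[v, u] * sigma[u]

    # Derivative of the pseudo-log-likelihood
    #   sum_v [2*beta*sigma_v*m_v - log(2*cosh(2*beta*m_v))]
    # with respect to beta; it is non-increasing in beta.
    def g(beta):
        return float(np.sum(2.0 * m * (sigma - np.tanh(2.0 * beta * m))))

    lo, hi = 0.0, beta_max
    if g(lo) <= 0.0:   # maximizer at (or below) the left endpoint
        return lo
    if g(hi) >= 0.0:   # maximizer at (or above) the right endpoint
        return hi
    while hi - lo > tol:  # bisection on the monotone derivative
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Usage: one sample on a 6-cycle with a single flipped spin.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
sigma = np.array([1.0, 1.0, 1.0, 1.0, -1.0, 1.0])
print(mple_beta(A, sigma))
```

Each conditional $\Pr(\sigma_v = s \mid \sigma_{-v}) \propto \exp(2\beta s\, m_v)$ depends on $\sigma$ only through the local field $m_v$, which is why one pass over the edges (nearly linear time on bounded-degree graphs) suffices; the truncated setting of the paper requires more care because conditionals must respect the constraint set $S$.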

Related Articles

Machine Learning

Anyone compared Gemma 4 31B

I have been seeing a lot of people claiming how good Gemma 4 31B model is. I know when compared to the size of models like sonnet which i...

Reddit - Artificial Intelligence · 1 min

Llms

Google’s Gemini AI can answer your questions with 3D models and simulations

Google's latest upgrade for Gemini will allow the chatbot to generate interactive 3D models and simulations in response to your questions...

The Verge - AI · 4 min

Machine Learning

The fear over Anthropic’s new AI model Mythos

AI Tools & Products · 5 min

Llms

The Gemini app can now generate interactive simulations and models.

AI Tools & Products · 1 min
