[2602.15923] A fully differentiable framework for training proxy Exchange Correlation Functionals for periodic systems

arXiv - Machine Learning

Summary

This paper presents a fully differentiable framework for integrating machine learning models as proxy exchange-correlation functionals within Density Functional Theory (DFT) for periodic systems, enabling end-to-end training through the self-consistent workflow.

Why It Matters

The integration of machine learning with DFT addresses the computational challenges faced in materials science, potentially accelerating research and development in this field. This framework allows for more efficient simulations, which can lead to advancements in material design and discovery.

Key Takeaways

  • Introduces a differentiable framework for proxy exchange-correlation functionals in DFT.
  • Enables gradients to flow through the entire DFT workflow, enhancing model training.
  • Achieves relative errors of 5-10% compared to established electronic structure packages.
  • Implemented in Python with a PyTorch backend for ease of use.
  • Promotes integration with existing libraries like DeepChem for broader accessibility.

Condensed Matter > Materials Science

arXiv:2602.15923 (cond-mat) [Submitted on 17 Feb 2026]

Title: A fully differentiable framework for training proxy Exchange Correlation Functionals for periodic systems

Authors: Rakshit Kumar Singh, Aryan Amit Barsainyan, Bharath Ramsundar

Abstract: Density Functional Theory (DFT) is widely used for first-principles simulations in chemistry and materials science, but its computational cost remains a key limitation for large systems. Motivated by recent advances in ML-based exchange-correlation (XC) functionals, this paper introduces a differentiable framework that integrates machine learning models into DFT for solids and other periodic systems. The framework defines a clean API for neural network models that can act as drop-in replacements for conventional XC functionals and enables gradients to flow through the full self-consistent DFT workflow. The framework is implemented in Python using a PyTorch backend, making it fully differentiable and easy to use with standard deep learning tools. We integrate the implementation with the DeepChem library to promote the reuse of established models and to lower the barrier for experimentation. In initial benchmarks against established electron...
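To make the "drop-in neural XC functional" idea concrete, here is a minimal, illustrative PyTorch sketch. It is not the paper's actual API: the `ProxyXC` module, the `xc_energy` helper, and the toy grid quadrature are all hypothetical stand-ins. The point it demonstrates is the core claim of the abstract, that because the XC energy is computed with differentiable operations, gradients flow back to both the density and the network parameters, which is what permits end-to-end training.

```python
import torch
import torch.nn as nn

class ProxyXC(nn.Module):
    """Hypothetical neural XC functional: maps the local electron density
    to an XC energy density per electron (an LDA-like form)."""
    def __init__(self, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, rho):
        # rho: (n_grid,) electron density sampled on a real-space grid
        return self.net(rho.unsqueeze(-1)).squeeze(-1)

def xc_energy(model, rho, grid_weights):
    # E_xc = integral of rho(r) * eps_xc(rho(r)) dr,
    # approximated here by a weighted sum over grid points
    return torch.sum(grid_weights * rho * model(rho))

# Toy usage: differentiate the XC energy with respect to the density.
model = ProxyXC()
rho = torch.rand(64, requires_grad=True)   # toy density values
w = torch.full((64,), 1.0 / 64)            # toy quadrature weights
E = xc_energy(model, rho, w)
E.backward()
print(rho.grad.shape)  # torch.Size([64])
```

After `backward()`, `rho.grad` holds samples of the functional derivative needed inside a self-consistent field loop, and the network parameters also carry gradients, so the same graph can drive training of the proxy functional.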
