[2602.15022] Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation

arXiv - AI · 4 min read

Summary

This paper proposes a canonicalization-based alternative to architecturally constrained diffusion models for molecular graph generation: train an unconstrained model on canonical orbit representatives, then restore invariance at sampling time. The authors report performance superior to equivariant baselines.

Why It Matters

The research challenges the convention of baking symmetry into generative models through architectural constraints such as equivariant denoisers and invariant priors. By introducing canonicalization instead, it offers a more efficient and effective way to generate molecular graphs, which matters for advances in chemistry and materials science.

Key Takeaways

  • Canonicalization improves training efficiency in diffusion models.
  • The proposed method outperforms equivariant baselines in 3D molecular generation tasks.
  • Aligned priors and optimal transport enhance the canonicalization framework.

Computer Science > Machine Learning
arXiv:2602.15022 (cs) · [Submitted on 16 Feb 2026]

Title: Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation
Authors: Cai Zhou, Zijie Chen, Zian Li, Jike Wang, Kaiyi Jiang, Pan Li, Rose Yu, Muhan Zhang, Stephen Bates, Tommi Jaakkola

Abstract: Many generative tasks in chemistry and science involve distributions invariant to group symmetries (e.g., permutation and rotation). A common strategy enforces invariance and equivariance through architectural constraints such as equivariant denoisers and invariant priors. In this paper, we challenge this tradition through the alternative canonicalization perspective: first map each sample to an orbit representative with a canonical pose or order, train an unconstrained (non-equivariant) diffusion or flow model on the canonical slice, and finally recover the invariant distribution by sampling a random symmetry transform at generation time. Building on a formal quotient-space perspective, our work provides a comprehensive theory of canonical diffusion by proving: (i) the correctness, universality and superior expressivity of canonical generative models over invariant targets; (ii) canonicalization accelerates training by removing diff...
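The abstract's three-step recipe (canonicalize, train an unconstrained model on the canonical slice, re-randomize at generation time) can be sketched for the rotation-symmetry case. The PCA-based canonicalization and the sign conventions below are illustrative assumptions for this sketch, not the authors' exact construction, and assume generic inputs (distinct principal variances, nonzero skewness along each axis):

```python
import numpy as np

def canonicalize(points):
    """Map a 3D point cloud to a rotation-orbit representative:
    center it, align principal axes with the coordinate axes, and
    break sign/reflection ambiguities deterministically."""
    centered = points - points.mean(axis=0)
    # Eigenvectors of the 3x3 covariance, reordered to descending variance.
    _, vecs = np.linalg.eigh(centered.T @ centered)
    vecs = vecs[:, ::-1]
    # Resolve each axis's sign with a rotation-equivariant statistic
    # (third moment of the projected coordinates).
    proj = centered @ vecs
    vecs = vecs * np.sign(np.sum(proj**3, axis=0))
    # Keep a proper rotation (det +1), not a reflection.
    if np.linalg.det(vecs) < 0:
        vecs[:, 2] *= -1
    return centered @ vecs

def random_rotation(rng):
    """Haar-random rotation in SO(3) via QR of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q = q * np.sign(np.diag(r))      # make the QR factorization unique
    if np.linalg.det(q) < 0:         # map O(3) sample into SO(3)
        q[:, [0, 1]] = q[:, [1, 0]]
    return q

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 3))

# Rotated copies of a cloud map to the same representative, so an
# unconstrained (non-equivariant) model can learn the canonical slice.
R = random_rotation(rng)
assert np.allclose(canonicalize(x @ R.T), canonicalize(x), atol=1e-6)

# At generation time, applying a fresh random rotation to a sample
# from the canonical slice recovers the rotation-invariant distribution.
sample = canonicalize(x) @ random_rotation(rng).T
```

In a real pipeline the diffusion or flow model would be trained on the outputs of `canonicalize`, and `random_rotation` would be applied to its generated samples; both names here are hypothetical helpers for this sketch.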
