[2602.00191] GEPC: Group-Equivariant Posterior Consistency for Out-of-Distribution Detection in Diffusion Models


Summary

The paper introduces Group-Equivariant Posterior Consistency (GEPC), a training-free method for detecting out-of-distribution (OOD) data in diffusion models by measuring how the learned score breaks equivariance under group transformations, achieving competitive results on benchmark datasets.

Why It Matters

As machine learning models increasingly handle diverse data, ensuring their robustness against out-of-distribution inputs is critical. GEPC provides a novel approach that enhances detection capabilities while maintaining computational efficiency, which is essential for real-world applications in AI safety and reliability.

Key Takeaways

  • GEPC measures how consistently learned scores transform under group actions.
  • It detects equivariance breaking even when score magnitudes are unchanged.
  • The method is computationally lightweight and interpretable.
  • GEPC shows competitive performance on OOD detection benchmarks.
  • It provides strong target-background separation in complex imagery.

Computer Science > Machine Learning

arXiv:2602.00191 (cs)

[Submitted on 30 Jan 2026 (v1), last revised 18 Feb 2026 (this version, v2)]

Title: GEPC: Group-Equivariant Posterior Consistency for Out-of-Distribution Detection in Diffusion Models

Authors: Yadang Alexis Rouzoumka, Jean Pinsolle, Eugénie Terreaux, Christèle Morisseau, Jean-Philippe Ovarlez, Chengfang Ren

Abstract: Diffusion models learn a time-indexed score field $\mathbf{s}_\theta(\mathbf{x}_t,t)$ that often inherits approximate equivariances (flips, rotations, circular shifts) from in-distribution (ID) data and convolutional backbones. Most diffusion-based out-of-distribution (OOD) detectors exploit score magnitude or local geometry (energies, curvature, covariance spectra) and largely ignore equivariances. We introduce Group-Equivariant Posterior Consistency (GEPC), a training-free probe that measures how consistently the learned score transforms under a finite group $\mathcal{G}$, detecting equivariance breaking even when score magnitude remains unchanged. At the population level, we propose the ideal GEPC residual, which averages an equivariance-residual functional over $\mathcal{G}$, and we derive ID upper bounds and...
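The abstract's central quantity, an equivariance residual averaged over a finite group, can be sketched numerically. The sketch below is illustrative only: the choice of group (the dihedral group of flips and 90-degree rotations acting on 2D arrays), the toy score functions, and the function names `group_actions` and `gepc_residual` are assumptions for this example, not the paper's implementation.

```python
import numpy as np

def group_actions():
    """The 8 elements of the dihedral group D4 acting on a square 2D array:
    rotations by 0/90/180/270 degrees, each with and without a left-right flip."""
    acts = []
    for k in range(4):
        acts.append(lambda x, k=k: np.rot90(x, k))
        acts.append(lambda x, k=k: np.fliplr(np.rot90(x, k)))
    return acts

def gepc_residual(score_fn, x_t, t, actions):
    """Group-averaged equivariance residual:
    R(x_t) = (1/|G|) * sum_g mean|| s(g.x_t, t) - g.s(x_t, t) ||^2.
    A perfectly equivariant score gives R = 0; equivariance breaking
    (as on OOD inputs) pushes R above the in-distribution range."""
    s = score_fn(x_t, t)
    total = 0.0
    for g in actions:
        total += float(np.mean((score_fn(g(x_t), t) - g(s)) ** 2))
    return total / len(actions)
```

For instance, the toy score `lambda z, t: -z` commutes with every flip and rotation, so its residual is exactly zero, while a score with a position-dependent bias term breaks equivariance and yields a strictly positive residual.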
