[2602.00191] GEPC: Group-Equivariant Posterior Consistency for Out-of-Distribution Detection in Diffusion Models
Summary
The paper introduces Group-Equivariant Posterior Consistency (GEPC), a method for detecting out-of-distribution data in diffusion models by measuring equivariance breaking, achieving competitive results on benchmark datasets.
Why It Matters
As machine learning models increasingly handle diverse data, ensuring their robustness against out-of-distribution inputs is critical. GEPC provides a novel approach that enhances detection capabilities while maintaining computational efficiency, which is essential for real-world applications in AI safety and reliability.
Key Takeaways
- GEPC measures how consistently learned scores transform under group actions.
- It detects equivariance breaking even when score magnitudes are unchanged.
- The method is computationally lightweight and interpretable.
- GEPC shows competitive performance on OOD detection benchmarks.
- It provides strong target-background separation in complex imagery.
Computer Science > Machine Learning
arXiv:2602.00191 (cs)
[Submitted on 30 Jan 2026 (v1), last revised 18 Feb 2026 (this version, v2)]
Title: GEPC: Group-Equivariant Posterior Consistency for Out-of-Distribution Detection in Diffusion Models
Authors: Yadang Alexis Rouzoumka, Jean Pinsolle, Eugénie Terreaux, Christèle Morisseau, Jean-Philippe Ovarlez, Chengfang Ren
Abstract: Diffusion models learn a time-indexed score field $\mathbf{s}_\theta(\mathbf{x}_t,t)$ that often inherits approximate equivariances (flips, rotations, circular shifts) from in-distribution (ID) data and convolutional backbones. Most diffusion-based out-of-distribution (OOD) detectors exploit score magnitude or local geometry (energies, curvature, covariance spectra) and largely ignore equivariances. We introduce Group-Equivariant Posterior Consistency (GEPC), a training-free probe that measures how consistently the learned score transforms under a finite group $\mathcal{G}$, detecting equivariance breaking even when score magnitude remains unchanged. At the population level, we propose the ideal GEPC residual, which averages an equivariance-residual functional over $\mathcal{G}$, and we derive ID upper bounds and...
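The core idea — averaging an equivariance-residual functional over a finite group $\mathcal{G}$ — can be sketched in a few lines. This is a minimal illustration, not the paper's exact functional: it assumes a mean-squared residual between the score of a transformed input and the transformed score, and uses a toy group of a horizontal flip and a 90-degree rotation.

```python
import numpy as np

def gepc_residual(score_fn, x, group):
    """Mean squared equivariance residual of score_fn at x over a finite group.

    An exactly equivariant score satisfies score(g(x)) == g(score(x)) for
    every g in the group, so the residual vanishes; equivariance breaking
    makes it grow. (Assumed mean-squared form, for illustration only.)
    """
    return float(np.mean([
        np.mean((score_fn(g(x)) - g(score_fn(x))) ** 2) for g in group
    ]))

# Toy group: horizontal flip and a 90-degree rotation.
group = [lambda a: a[:, ::-1], lambda a: np.rot90(a)]

x = np.random.default_rng(0).normal(size=(8, 8))

# The score of a standard Gaussian, s(x) = -x, is exactly equivariant
# under flips and rotations, so its residual is zero.
print(gepc_residual(lambda a: -a, x, group))

# A score with a fixed column-dependent bias breaks equivariance,
# so the residual becomes strictly positive.
bias = np.linspace(0.0, 1.0, 8)
print(gepc_residual(lambda a: -a + bias, x, group))
```

In a real diffusion model, `score_fn` would be the learned network $\mathbf{s}_\theta(\cdot, t)$ at a fixed noise level, and an OOD score would aggregate this residual over timesteps; this probe needs no retraining, which matches the paper's "training-free" claim.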