[2603.02010] Noise-Calibrated Inference from Differentially Private Sufficient Statistics in Exponential Families
arXiv:2603.02010 (cs) [Submitted on 2 Mar 2026]
Computer Science > Machine Learning
Title: Noise-Calibrated Inference from Differentially Private Sufficient Statistics in Exponential Families
Authors: Amir Asiaee, Samhita Pal

Abstract: Many differentially private (DP) data release systems either output DP synthetic data and leave analysts to perform inference as usual, which can lead to severe miscalibration, or output a DP point estimate with no principled way to quantify uncertainty. This paper develops a clean and tractable middle ground for exponential families: release only DP sufficient statistics, then perform noise-calibrated likelihood-based inference and optional parametric synthetic data generation as post-processing. Our contributions are: (1) a general recipe for approximate-DP release of clipped sufficient statistics under the Gaussian mechanism; (2) asymptotic normality, explicit variance inflation, and valid Wald-style confidence intervals for the plug-in DP MLE; (3) a noise-aware likelihood correction that is first-order equivalent to the plug-in but supports bootstrap-based intervals; and (4) a matching minimax lower bound showing the privacy distortion rate is unavoidable. The resulting theory yields concrete design rules and a practical pipeline for re...
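The release step described in contribution (1) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it takes the simplest exponential-family case (a Gaussian mean model, where the sufficient statistic of a record is the record itself), clips each per-record statistic to an L2 bound, and adds Gaussian noise calibrated with the standard (eps, delta) Gaussian-mechanism formula. The function name, parameter names, and the choice of model are all illustrative assumptions.

```python
import numpy as np

def dp_sufficient_stats(x, clip=1.0, eps=1.0, delta=1e-5, seed=None):
    """Release a clipped, Gaussian-noised sum of per-record sufficient
    statistics T(x_i). Here T(x) = x (Gaussian mean model), so the
    released quantity is a noisy clipped sum. Illustrative sketch only;
    the paper's general recipe covers arbitrary exponential families."""
    rng = np.random.default_rng(seed)
    t = np.atleast_2d(np.asarray(x, dtype=float))        # shape (n, d)
    # Clip each record's statistic to L2 norm <= clip (bounds sensitivity).
    norms = np.linalg.norm(t, axis=1, keepdims=True)
    t_clipped = t * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
    # Standard Gaussian-mechanism noise scale for (eps, delta)-DP,
    # with L2 sensitivity equal to the clipping bound.
    sigma = clip * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    s = t_clipped.sum(axis=0)
    return s + rng.normal(0.0, sigma, size=s.shape), sigma

# Plug-in DP MLE for the mean: divide the noisy sum by n.
x = np.random.default_rng(0).normal(0.5, 1.0, size=(1000, 1))
s_priv, sigma = dp_sufficient_stats(x, clip=3.0, eps=1.0, delta=1e-5, seed=1)
mu_hat = s_priv / len(x)
```

In this toy setting the variance inflation from contribution (2) is visible directly: the plug-in estimator's variance is the sampling term plus an added privacy term `sigma**2 / n**2`, which is what a noise-calibrated Wald interval must account for.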