[2602.13619] Locally Private Parametric Methods for Change-Point Detection
Summary
This paper develops parametric methods for change-point detection (identifying distributional changes in a time series) under local differential privacy, pairing a non-private baseline based on the generalized log-likelihood ratio test with two locally private detection algorithms.
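To make the non-private baseline concrete, here is a minimal sketch of a generalized log-likelihood ratio (GLR) scan for a single change in mean, assuming Gaussian observations with known variance. This is an illustrative toy model, not the paper's construction; the paper treats a more general parametric setting and derives finite-sample guarantees via martingale methods.

```python
import numpy as np

def glr_change_point(x, sigma=1.0):
    """GLR scan for one mean change in a Gaussian sequence with known
    variance sigma^2 (illustrative sketch, not the paper's algorithm).

    For each candidate split k, compares the fit of separate pre/post
    means against a single common mean; returns the maximizing split
    and twice the log-likelihood ratio there."""
    n = len(x)
    overall = np.mean(x)
    best_k, best_stat = None, -np.inf
    for k in range(1, n):  # change hypothesized between x[k-1] and x[k]
        mu1, mu2 = np.mean(x[:k]), np.mean(x[k:])
        # 2 * log GLR under the known-variance Gaussian model
        stat = (np.sum((x - overall) ** 2)
                - np.sum((x[:k] - mu1) ** 2)
                - np.sum((x[k:] - mu2) ** 2)) / sigma ** 2
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Synthetic example: mean shifts from 0 to 2 at index 100.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
k_hat, stat = glr_change_point(x)
```

A detection rule would declare a change when the maximized statistic exceeds a threshold calibrated to a target false-alarm rate; the paper's contribution includes finite-sample control of exactly this kind of error.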
Why It Matters
As data privacy concerns grow, this research addresses change-point detection under local differential privacy, a setting in which each observation is privatized before it is ever collected. It quantifies the trade-off between privacy and statistical performance, which matters for deployments on sensitive data.
Key Takeaways
- Introduces locally differentially private algorithms for change-point detection.
- Derives improved finite-sample accuracy guarantees, via martingale methods, for a non-private detector based on the generalized log-likelihood ratio test.
- Establishes bounds on detection accuracy in private settings.
- Analyzes the statistical cost of local differential privacy.
- Proves a structural result for strong data processing inequalities (SDPI): the SDPI coefficients for Rényi divergences and their symmetric Jeffreys-Rényi variants are achieved by binary input distributions, a finding of independent interest.
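One of the two private algorithms builds on randomized response, the canonical locally private mechanism for binary data. The sketch below shows the standard epsilon-LDP randomized-response protocol and its unbiased mean estimator; it is a textbook illustration under assumed binary inputs, not the paper's full change-point procedure.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Standard epsilon-LDP randomized response: report the true bit
    with probability e^eps / (e^eps + 1), otherwise flip it.
    (Illustrative sketch; the paper's detection protocol layers a
    change-point test on top of privatized reports like these.)"""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_truth else 1 - bit

def debias(mean_of_reports, epsilon):
    """Invert the known flip probability to get an unbiased estimate
    of the true mean of the underlying bits."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return (mean_of_reports - (1 - p)) / (2 * p - 1)

random.seed(1)
eps = 1.0
true_bits = [1] * 300 + [0] * 700          # true mean is 0.3
reports = [randomized_response(b, eps) for b in true_bits]
est = debias(sum(reports) / len(reports), eps)
```

Because each report is flipped with constant probability regardless of the input, no single report reveals much about its bit; the statistical price is the inflated variance that the debiasing step exposes, which is the kind of cost the paper's bounds characterize.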
Statistics > Machine Learning · arXiv:2602.13619 (stat) · Submitted on 14 Feb 2026
Title: Locally Private Parametric Methods for Change-Point Detection
Authors: Anuj Kumar Yadav, Cemre Cadir, Yanina Shkel, Michael Gastpar
Abstract: We study parametric change-point detection, where the goal is to identify distributional changes in time series, under local differential privacy. In the non-private setting, we derive improved finite-sample accuracy guarantees for a change-point detection algorithm based on the generalized log-likelihood ratio test, via martingale methods. In the private setting, we propose two locally differentially private algorithms based on randomized response and binary mechanisms, and analyze their theoretical performance. We derive bounds on detection accuracy and validate our results through empirical evaluation. Our results characterize the statistical cost of local differential privacy in change-point detection and show how privacy degrades performance relative to a non-private benchmark. As part of this analysis, we establish a structural result for strong data processing inequalities (SDPI), proving that SDPI coefficients for Rényi divergences and their symmetric variants (Jeffreys-Rényi divergences) are achieved by binary input distributions. These results on SDPI coefficients are also of independent interest, wi...
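For reference, the quantities named in the abstract can be written out. The Rényi divergence of order alpha and the SDPI coefficient of a channel W (here, the privacy mechanism) have the standard definitions below; the precise form of the Jeffreys-Rényi variant used in the paper is not given in the abstract, so only the two standard objects are shown.

```latex
% Rényi divergence of order \alpha (\alpha > 0, \alpha \neq 1):
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}
  \log \sum_x P(x)^{\alpha}\, Q(x)^{1-\alpha}

% SDPI coefficient of a channel W, the largest possible contraction
% factor of the divergence after both inputs pass through W:
\eta_\alpha(W) \;=\; \sup_{P \neq Q}
  \frac{D_\alpha(PW \,\|\, QW)}{D_\alpha(P \,\|\, Q)}
```

The paper's structural result states that the supremum defining this coefficient (and its Jeffreys-Rényi analogue) is attained by input distributions supported on just two points, which simplifies its computation and analysis.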