[2602.13619] Locally Private Parametric Methods for Change-Point Detection


Summary

This paper presents novel locally private parametric methods for change-point detection, focusing on maintaining privacy while identifying distributional changes in time series data.

Why It Matters

As data privacy concerns grow, this research addresses the challenge of change-point detection under local differential privacy. It provides insights into the trade-offs between privacy and statistical performance, which is crucial for applications in sensitive data environments.

Key Takeaways

  • Introduces locally differentially private algorithms for change-point detection.
  • Derives improved finite-sample accuracy guarantees for the non-private detector via martingale methods.
  • Establishes bounds on detection accuracy in private settings.
  • Analyzes the statistical cost of local differential privacy.
  • Proves a structural result for strong data processing inequalities (SDPI): SDPI coefficients for Rényi and Jeffreys-Rényi divergences are achieved by binary input distributions.
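One of the two private mechanisms the paper builds on, randomized response, can be sketched for a single bit as follows. This is an illustrative sketch only: the function names and the debiasing estimator are standard textbook constructions, not the paper's implementation.

```python
import math
import random

def randomized_response(bit: bool, epsilon: float, rng: random.Random) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1); flip it otherwise.
    This satisfies epsilon-local differential privacy for a single bit."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if rng.random() < p_truth else (not bit)

def debiased_mean(reports, epsilon: float) -> float:
    """Unbiased estimate of the true fraction of ones from the noisy reports."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p_truth)) / (2.0 * p_truth - 1.0)
```

Each user perturbs their own bit locally before sending it, so the analyst never sees raw data; the debiasing step recovers population-level statistics at the cost of extra variance, which is the "statistical cost of privacy" the paper quantifies.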

Statistics > Machine Learning · arXiv:2602.13619 (stat) · Submitted on 14 Feb 2026

Title: Locally Private Parametric Methods for Change-Point Detection

Authors: Anuj Kumar Yadav, Cemre Cadir, Yanina Shkel, Michael Gastpar

Abstract: We study parametric change-point detection, where the goal is to identify distributional changes in time series, under local differential privacy. In the non-private setting, we derive improved finite-sample accuracy guarantees for a change-point detection algorithm based on the generalized log-likelihood ratio test, via martingale methods. In the private setting, we propose two locally differentially private algorithms based on randomized response and binary mechanisms, and analyze their theoretical performance. We derive bounds on detection accuracy and validate our results through empirical evaluation. Our results characterize the statistical cost of local differential privacy in change-point detection and show how privacy degrades performance relative to a non-private benchmark. As part of this analysis, we establish a structural result for strong data processing inequalities (SDPI), proving that SDPI coefficients for Rényi divergences and their symmetric variants (Jeffreys-Rényi divergences) are achieved by binary input distributions. These results on SDPI coefficients are also of independent interest, wi...
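As a rough illustration of the non-private baseline, a generalized log-likelihood ratio scan for a single mean shift in unit-variance Gaussian data with a known pre-change mean can be written as below. The function signature and threshold are hypothetical choices for this sketch, not the paper's algorithm or tuning.

```python
def glr_change_point(xs, mu0=0.0, threshold=4.0):
    """Scan all candidate change points k and compute the GLR statistic
    for a mean shift (known pre-change mean mu0, unit variance, unknown
    post-change mean). Returns (detected, estimated_index, max_statistic)."""
    n = len(xs)
    best_k, best_stat = None, 0.0
    for k in range(n - 1):  # candidate change points
        tail = xs[k:]
        s = sum(x - mu0 for x in tail)
        # GLR statistic: the post-change mean is profiled out, leaving
        # (sum of deviations)^2 / (2 * segment length).
        stat = (s * s) / (2.0 * len(tail))
        if stat > best_stat:
            best_stat, best_k = stat, k
    return best_stat > threshold, best_k, best_stat
```

The statistic peaks at the true change point when the shift is large relative to noise; the finite-sample guarantees in the paper control how quickly and reliably this happens, and the private variants replace the raw observations with locally perturbed ones before a scan of this kind.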

