[2602.06838] An Adaptive Differentially Private Federated Learning Framework with Bi-level Optimization
Summary
This paper presents an adaptive differentially private federated learning framework that improves model efficiency and training stability across heterogeneous data environments.
Why It Matters
The research is significant as it tackles the critical issues of data privacy and model performance in federated learning, particularly in real-world applications where data is often non-IID and device capabilities vary. By enhancing stability and accuracy, this framework could improve the deployment of federated learning in sensitive applications, such as healthcare and finance.
Key Takeaways
- Introduces a framework to enhance federated learning efficiency under privacy constraints.
- Utilizes adaptive gradient clipping to improve model stability.
- Implements a lightweight local compression module to mitigate noise amplification.
- Demonstrates improved convergence stability and accuracy on CIFAR-10 and SVHN datasets.
- Addresses challenges posed by heterogeneous data and device variability.
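The summary does not spell out the paper's exact adaptation rule, but adaptive gradient clipping under differential privacy is commonly realized by clipping per-client updates to a threshold, adding calibrated Gaussian noise, and adjusting the threshold toward a target quantile of update norms. The sketch below (function name, quantile-tracking update, and all parameters are illustrative assumptions, not the authors' method) shows one round of such a scheme:

```python
import numpy as np

def adaptive_clip_and_noise(grads, clip_norm, noise_multiplier,
                            target_quantile=0.5, lr_clip=0.2, rng=None):
    """Illustrative round of adaptive clipping with Gaussian noise.

    grads: list of per-client update vectors (np.ndarray, same shape).
    clip_norm: current clipping threshold C.
    Returns (noisy averaged update, updated clip_norm).
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped, n_within = [], 0
    for g in grads:
        norm = np.linalg.norm(g)
        # Scale down any update whose L2 norm exceeds C.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
        n_within += int(norm <= clip_norm)
    frac = n_within / len(grads)          # fraction of norms already within C
    mean_g = np.mean(clipped, axis=0)
    # Gaussian noise scaled to the clipping bound (per-round sensitivity C / n).
    sigma = noise_multiplier * clip_norm / len(grads)
    noisy_g = mean_g + rng.normal(0.0, sigma, size=mean_g.shape)
    # Geometric update: shrink C when most norms fit, grow it otherwise.
    new_clip = clip_norm * np.exp(-lr_clip * (frac - target_quantile))
    return noisy_g, new_clip
```

Because the threshold tracks the observed norm distribution, clients with heterogeneous (Non-IID) data are less likely to be either over-clipped or drowned in noise than under a fixed threshold.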
Computer Science > Artificial Intelligence
arXiv:2602.06838 (cs)
This paper has been withdrawn by Hui Ma.
[Submitted on 6 Feb 2026 (v1), last revised 19 Feb 2026 (this version, v2)]
Title: An Adaptive Differentially Private Federated Learning Framework with Bi-level Optimization
Authors: Jin Wang, Hui Ma, Fei Xing, Ming Yan
Abstract: Federated learning enables collaborative model training across distributed clients while preserving data privacy. However, in practical deployments, device heterogeneity and non-independent and identically distributed (Non-IID) data often lead to highly unstable and biased gradient updates. When differential privacy is enforced, conventional fixed gradient clipping and Gaussian noise injection may further amplify gradient perturbations, resulting in training oscillation and degraded model performance. To address these challenges, we propose an adaptive differentially private federated learning framework that explicitly targets model efficiency under heterogeneous and privacy-constrained settings. On the client side, a lightweight local compression module is introduced to regularize intermediate representations and constrain gradient variability, thereby mitigating noise amplification during local optimization. On the server s...
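The abstract describes the client-side module only at a high level, as regularizing intermediate representations to constrain gradient variability. One common way to realize such a lightweight compression regularizer (a sketch under our own assumptions; the function, weight names, and bottleneck design are hypothetical, not taken from the paper) is to project activations through a low-rank bottleneck and penalize reconstruction error, which discourages high-variance representations that would inflate per-sample gradient norms:

```python
import numpy as np

def compression_regularizer(h, W_down, W_up):
    """Hypothetical bottleneck penalty on intermediate representations.

    h:      activations of shape (batch, d).
    W_down: (d, k) projection into a bottleneck, k << d.
    W_up:   (k, d) projection back to the original width.
    Returns the mean squared reconstruction error, to be added to the
    local task loss as `loss + lam * compression_regularizer(...)`.
    """
    z = h @ W_down                         # compress: (batch, k)
    h_rec = z @ W_up                       # reconstruct: (batch, d)
    return float(np.mean(np.sum((h - h_rec) ** 2, axis=1)))
```

Representations that fit in the bottleneck incur no penalty, while components outside its span are pushed toward zero, which tends to tighten the distribution of gradient norms before clipping and noise are applied.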