[2602.18384] FedZMG: Efficient Client-Side Optimization in Federated Learning
Summary
The paper presents FedZMG, a novel client-side optimization algorithm for Federated Learning that addresses client-drift issues without increasing computational complexity or communication overhead.
Why It Matters
As Federated Learning becomes increasingly important for privacy-preserving AI, optimizing client-side performance is crucial. FedZMG offers a solution to improve convergence speed and model accuracy in non-IID data scenarios, making it relevant for developers and researchers in machine learning and AI.
Key Takeaways
- FedZMG is a parameter-free optimization algorithm designed to reduce client-drift in Federated Learning.
- The method projects local gradients onto a zero-mean hyperplane, improving convergence without additional communication overhead.
- Empirical evaluations show that FedZMG outperforms standard FedAvg and adaptive optimizers like FedAdam in non-IID settings.
- Theoretical analysis shows that FedZMG reduces the effective gradient variance, yielding tighter convergence bounds than uncentered local SGD.
- This advancement is particularly beneficial for resource-constrained IoT environments.
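The core operation behind these takeaways is simple to state: subtracting a gradient's mean projects it onto the hyperplane of zero-mean vectors. A minimal sketch of that projection (an illustrative reconstruction, not the authors' released code):

```python
import numpy as np

def zero_mean_project(grad: np.ndarray) -> np.ndarray:
    """Project a gradient onto the zero-mean hyperplane.

    Subtracting the mean is the Gradient Centralization-style step:
    the centered gradient sums to zero, removing a uniform "bias"
    shift while leaving relative component differences intact.
    """
    return grad - grad.mean()

g = np.array([3.0, 1.0, 2.0])
g_c = zero_mean_project(g)   # -> [1.0, -1.0, 0.0], sums to zero
```

Because the projection is a single subtraction per gradient, it adds no hyperparameters, no optimizer state, and no communication, which is consistent with the "parameter-free" and "no overhead" claims above.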
Computer Science > Machine Learning
arXiv:2602.18384 (cs)
[Submitted on 20 Feb 2026]
Title: FedZMG: Efficient Client-Side Optimization in Federated Learning
Authors: Fotios Zantalis, Evangelos Zervas, Grigorios Koulouras
Abstract: Federated Learning (FL) enables distributed model training on edge devices while preserving data privacy. However, clients tend to hold non-Independent and Identically Distributed (non-IID) data, which often leads to client-drift and thereby diminishes convergence speed and model performance. While adaptive optimizers have been proposed to mitigate these effects, they frequently introduce computational complexity or communication overhead unsuitable for resource-constrained IoT environments. This paper introduces Federated Zero Mean Gradients (FedZMG), a novel, parameter-free, client-side optimization algorithm designed to tackle client-drift by structurally regularizing the optimization space. Advancing the idea of Gradient Centralization, FedZMG projects local gradients onto a zero-mean hyperplane, effectively neutralizing the "intensity" or "bias" shifts inherent in heterogeneous data distributions without requiring additional communication or hyperparameter tuning. A theoretical analysis is provided, proving that FedZMG reduces the effective gradient variance and guarantees tight...
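The abstract describes FedZMG as a client-side change to the local update: center each local gradient before applying the ordinary SGD step. A hedged sketch of how that could slot into a client's local training loop (function names, per-layer centering, and the learning rate are illustrative assumptions, not details from the paper):

```python
import numpy as np

def fedzmg_local_step(weights, grads, lr=0.1):
    """One hypothetical FedZMG-style client update.

    Each layer's gradient is centered (its mean subtracted) before a
    plain SGD step. No extra state, communication, or hyperparameters
    are introduced, matching the paper's parameter-free claim; the
    exact placement of the centering here is an assumption.
    """
    updated = []
    for w, g in zip(weights, grads):
        g_centered = g - g.mean()       # zero-mean projection per layer
        updated.append(w - lr * g_centered)
    return updated

# Example: a single one-layer "model" taking one centered step.
w0 = [np.zeros(3)]
g0 = [np.array([3.0, 1.0, 2.0])]
w1 = fedzmg_local_step(w0, g0, lr=1.0)
```

After local training, clients would send their updated weights to the server for standard FedAvg-style aggregation; only the local gradient computation changes.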