[2508.17381] DART: A Server-side Plug-in for Resource-efficient Robust Federated Learning
Computer Science > Machine Learning
arXiv:2508.17381 (cs)
[Submitted on 24 Aug 2025 (v1), last revised 25 Mar 2026 (this version, v2)]

Title: DART: A Server-side Plug-in for Resource-efficient Robust Federated Learning
Authors: Omar Bekdache, Naresh Shanbhag

Abstract: Federated learning (FL) has emerged as a popular distributed algorithm for training machine learning models on edge devices while preserving data privacy. However, FL systems face challenges from client-side computational constraints and from a lack of robustness to naturally occurring corruptions such as noise, blur, and weather effects. Existing robust training methods are computationally expensive and unsuitable for resource-constrained clients. We propose a novel data-agnostic robust training (DART) plug-in that can be deployed in any FL system to enhance robustness at zero client overhead. DART operates on the server side and does not require access to private data, ensuring seamless integration into existing FL systems. Extensive experiments showcase DART's ability to enhance the robustness of state-of-the-art FL systems, establishing it as a practical and scalable solution for real-world robust FL deployment.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2508.17381 [cs.LG] (or arXiv:2508.17381v2 [cs.LG] for this version) https://doi....
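The abstract does not describe DART's internals, but it does pin down where such a plug-in sits: after standard client-side training and server aggregation, operating only on the aggregated model with no access to private client data. The minimal sketch below illustrates that integration point in a FedAvg-style loop; the `smoothing_plugin` is a hypothetical stand-in (a crude weight-smoothing heuristic), not the DART algorithm, and all function names and parameters are assumptions for illustration.

```python
import numpy as np

def client_update(global_w, X, y, lr=0.1, steps=5):
    """One client's local SGD on a linear least-squares model (private data stays local)."""
    w = global_w.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data, plugin=None):
    """One FL round: clients train locally, the server averages their models, and an
    optional server-side plug-in then post-processes the global model. Clients run
    unmodified, so the hook adds zero client overhead."""
    local_ws = [client_update(global_w, X, y) for X, y in client_data]
    new_w = np.mean(local_ws, axis=0)  # standard FedAvg aggregation
    if plugin is not None:
        new_w = plugin(new_w)  # server-side hook: sees only aggregated weights
    return new_w

def smoothing_plugin(w, noise_std=0.01, samples=8, rng=np.random.default_rng(0)):
    """Hypothetical robustness plug-in: average the weights over small random
    perturbations. A placeholder heuristic, NOT the method from the paper."""
    return np.mean([w + rng.normal(0.0, noise_std, w.shape) for _ in range(samples)],
                   axis=0)

# Tiny demo with synthetic client datasets.
rng = np.random.default_rng(42)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(32, 2))
    y = X @ true_w + 0.05 * rng.normal(size=32)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients, plugin=smoothing_plugin)
print(np.round(w, 2))
```

Because the plug-in is just a callable applied to the aggregated weights, it can be dropped into an existing server loop without touching client code or data, which is the deployment property the abstract emphasizes.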