[2602.15337] FedPSA: Modeling Behavioral Staleness in Asynchronous Federated Learning
Summary
The paper presents FedPSA, a novel framework for Asynchronous Federated Learning that improves performance by dynamically measuring model staleness through parameter sensitivity.
Why It Matters
Asynchronous Federated Learning speeds up distributed training by letting the server proceed without waiting for slow clients. However, existing methods measure model staleness only by round difference, a coarse-grained signal that ignores the model itself and can degrade performance. FedPSA addresses this gap with a finer-grained staleness measure, which could lead to meaningful advances in federated learning applications.
Key Takeaways
- FedPSA improves the measurement of model staleness in asynchronous federated learning.
- The framework uses parameter sensitivity as a fine-grained measure of how obsolete a client model is, rather than relying on round difference alone.
- A dynamic momentum queue estimates the current training phase in real time and adjusts the tolerance for outdated updates accordingly.
- Experimental results show a performance improvement of up to 6.37% over baseline methods.
- FedPSA suggests that fine-grained, behavior-aware staleness modeling can raise the performance ceiling of asynchronous methods.
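The paper does not spell out its exact formulas here, but the core idea of the takeaways above can be sketched as a server-side aggregation rule that weights a stale client update by more than its round gap. In this minimal sketch, all names are hypothetical and "parameter sensitivity" is approximated by normalized parameter drift between the client model and the current global model, which is an assumption, not the paper's definition:

```python
import numpy as np

def sensitivity_weight(client_params, global_params, round_gap, phase_tolerance=1.0):
    """Hypothetical staleness weight: refines the coarse round gap with a
    parameter-level drift term (assumed stand-in for parameter sensitivity)."""
    drift = np.linalg.norm(client_params - global_params) / (
        np.linalg.norm(global_params) + 1e-12
    )
    # A stale update is discounted more when the parameters have actually
    # moved far; an identical round gap with little drift is penalized less.
    return phase_tolerance / (1.0 + round_gap * drift)

def async_aggregate(global_params, client_params, round_gap, lr=1.0):
    """Server-side asynchronous step: blend a (possibly stale) client model
    into the global model with a staleness-aware weight."""
    w = sensitivity_weight(client_params, global_params, round_gap)
    return global_params + lr * w * (client_params - global_params)
```

Under this toy rule, a fresh update (`round_gap == 0`) is applied almost at full weight, while the same update arriving many rounds late is increasingly discounted.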
arXiv:2602.15337 (cs) — Computer Science > Machine Learning
Submitted on 17 Feb 2026
Title: FedPSA: Modeling Behavioral Staleness in Asynchronous Federated Learning
Authors: Chaoyi Lu
Abstract: Asynchronous Federated Learning (AFL) has emerged as a significant research area in recent years. By not waiting for slower clients and executing the training process concurrently, it achieves faster training speed compared to traditional federated learning. However, due to the staleness introduced by the asynchronous process, its performance may degrade in some scenarios. Existing methods often use the round difference between the current model and the global model as the sole measure of staleness, which is coarse-grained and lacks observation of the model itself, thereby limiting the performance ceiling of asynchronous methods. In this paper, we propose FedPSA (Parameter Sensitivity-based Asynchronous Federated Learning), a more fine-grained AFL framework that leverages parameter sensitivity to measure model obsolescence and establishes a dynamic momentum queue to assess the current training phase in real time, thereby adjusting the tolerance for outdated information dynamically. Extensive experiments on multiple datasets and comparisons with various methods demonstrate the superior performance of FedPSA, achieving up to 6.37% improvement...
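The abstract's "dynamic momentum queue" is described only at a high level, but the stated purpose (assess the current training phase in real time, then adjust staleness tolerance) can be illustrated with a small sketch. Everything here, including the class name and the mapping from update magnitude to tolerance, is an assumed illustration, not the paper's actual mechanism:

```python
from collections import deque
import numpy as np

class MomentumQueue:
    """Hypothetical dynamic momentum queue: keeps the magnitudes of the most
    recent global-model updates as a rough training-phase signal. Large
    recent updates (early training) suggest stale models should be trusted
    less; small updates (near convergence) allow a higher tolerance."""

    def __init__(self, maxlen=10):
        self.magnitudes = deque(maxlen=maxlen)

    def record(self, delta):
        # delta: the latest change applied to the global parameters.
        self.magnitudes.append(float(np.linalg.norm(delta)))

    def tolerance(self):
        if not self.magnitudes:
            return 1.0  # no history yet: neutral tolerance
        avg = sum(self.magnitudes) / len(self.magnitudes)
        # Map the average recent update magnitude into (0, 1]: the faster
        # the global model is still moving, the lower the tolerance.
        return 1.0 / (1.0 + avg)
```

In an asynchronous loop, the server would call `record` after each global update and feed `tolerance()` into the staleness weighting of incoming client models; the bounded `deque` makes the estimate track only the recent phase of training.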