[2602.15337] FedPSA: Modeling Behavioral Staleness in Asynchronous Federated Learning

arXiv - Machine Learning

Summary

The paper presents FedPSA, a framework for Asynchronous Federated Learning (AFL) that measures model staleness at a finer grain via parameter sensitivity and dynamically adjusts how much stale information the server tolerates, improving performance over round-based staleness measures.

Why It Matters

Asynchronous Federated Learning speeds up distributed training by letting the server aggregate updates without waiting for slow clients. However, existing methods typically measure staleness only by the round gap between a client's base model and the current global model, a coarse signal that can hinder performance. FedPSA addresses this gap with a more refined, model-aware measure of staleness that could lead to meaningful gains in federated learning applications.

Key Takeaways

  • FedPSA replaces round-difference staleness with a finer-grained measure based on parameter sensitivity.
  • A dynamic momentum queue estimates the current training phase in real time.
  • The tolerance for stale updates is adjusted dynamically, which leads to better training outcomes.
  • Experimental results show a performance improvement of up to 6.37% over baseline methods.
  • FedPSA outperforms a range of prior asynchronous federated learning methods across multiple datasets.
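For context, the coarse round-difference staleness weighting that FedPSA improves on can be sketched as follows. The polynomial decay form is a common choice in prior asynchronous FL work (e.g., FedAsync), assumed here for illustration; it is not specified in this article.

```python
# Sketch of round-difference staleness weighting, the coarse-grained
# baseline the takeaways contrast against. The polynomial decay is an
# assumption borrowed from prior AFL work, not from this paper.

def staleness_weight(server_round: int, client_round: int, alpha: float = 0.5) -> float:
    """Down-weight a client update by how many rounds old its base model is."""
    staleness = server_round - client_round   # round gap is the ONLY signal
    return (1.0 + staleness) ** (-alpha)      # older updates get less weight

def merge(global_params, client_params, weight):
    """Blend a (possibly stale) client update into the global model."""
    return [(1 - weight) * g + weight * c
            for g, c in zip(global_params, client_params)]
```

Note that two updates with the same round gap receive the same weight regardless of how much the model actually changed in between, which is exactly the coarseness the paper criticizes.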

Computer Science > Machine Learning
arXiv:2602.15337 (cs) [Submitted on 17 Feb 2026]

Title: FedPSA: Modeling Behavioral Staleness in Asynchronous Federated Learning
Authors: Chaoyi Lu

Abstract: Asynchronous Federated Learning (AFL) has emerged as a significant research area in recent years. By not waiting for slower clients and executing the training process concurrently, it achieves faster training speed compared to traditional federated learning. However, due to the staleness introduced by the asynchronous process, its performance may degrade in some scenarios. Existing methods often use the round difference between the current model and the global model as the sole measure of staleness, which is coarse-grained and lacks observation of the model itself, thereby limiting the performance ceiling of asynchronous methods. In this paper, we propose FedPSA (Parameter Sensitivity-based Asynchronous Federated Learning), a more fine-grained AFL framework that leverages parameter sensitivity to measure model obsolescence and establishes a dynamic momentum queue to assess the current training phase in real time, thereby adjusting the tolerance for outdated information dynamically. Extensive experiments on multiple datasets and comparisons with various methods demonstrate the superior performance of FedPSA, achieving up to 6.37% improvement...
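The abstract describes the two ingredients, sensitivity-based staleness and a dynamic momentum queue, only at a high level (and is truncated here), so the sketch below is illustrative rather than FedPSA's actual formulas: the concrete sensitivity proxy, the queue-based tolerance rule, and all names are assumptions.

```python
import math
from collections import deque

# Illustrative sketch only. FedPSA's exact definitions are not given in
# this summary; the sensitivity-weighted drift and the exponential
# tolerance rule below are assumptions chosen to match the abstract's
# high-level description.

def sensitivity_staleness(stale_params, current_params, sensitivity):
    """Staleness as sensitivity-weighted parameter drift, not a round count."""
    drift = sum(s * abs(c - p)
                for s, c, p in zip(sensitivity, current_params, stale_params))
    return drift / (sum(sensitivity) + 1e-12)

class MomentumQueue:
    """Track recent global-update magnitudes to gauge the training phase."""
    def __init__(self, maxlen: int = 5):
        self.q = deque(maxlen=maxlen)

    def push(self, update_norm: float) -> None:
        self.q.append(update_norm)

    def tolerance(self) -> float:
        # Early training (large recent updates): tolerate stale info more.
        # Late training (small updates): tighten the tolerance.
        if not self.q:
            return 1.0
        return sum(self.q) / len(self.q)

def update_weight(staleness: float, tolerance: float) -> float:
    """Exponentially down-weight updates whose staleness exceeds the tolerance."""
    return math.exp(-staleness / (tolerance + 1e-12))
```

Unlike the round-gap baseline, an update here is penalized only to the extent that the parameters it matters for have actually drifted, and the penalty itself relaxes or tightens with the training phase.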
