[2602.13486] Preventing Rank Collapse in Federated Low-Rank Adaptation with Client Heterogeneity
Summary
This paper introduces raFLoRA, a method that prevents the rank collapse caused by client heterogeneity in federated low-rank adaptation (FedLoRA), improving both model performance and communication efficiency.
Why It Matters
The study addresses a critical issue in federated learning where client heterogeneity can lead to rank collapse, negatively impacting model performance. By proposing a novel aggregation method, the research contributes to improving the effectiveness of federated learning systems, which are increasingly important for privacy-preserving AI applications.
Key Takeaways
- Identifies rank collapse as a significant issue in heterogeneous FedLoRA.
- Proposes raFLoRA, a rank-partitioned aggregation method to enhance performance.
- Demonstrates improved model performance and communication efficiency through extensive experiments.
Computer Science > Machine Learning
arXiv:2602.13486 (cs) [Submitted on 13 Feb 2026]
Title: Preventing Rank Collapse in Federated Low-Rank Adaptation with Client Heterogeneity
Authors: Fei Wu, Jia Hu, Geyong Min, Shiqiang Wang
Abstract: Federated low-rank adaptation (FedLoRA) has facilitated communication-efficient and privacy-preserving fine-tuning of foundation models for downstream tasks. In practical federated learning scenarios, client heterogeneity in system resources and data distributions motivates heterogeneous LoRA ranks across clients. We identify a previously overlooked phenomenon in heterogeneous FedLoRA, termed rank collapse, where the energy of the global update concentrates on the minimum shared rank, resulting in suboptimal performance and high sensitivity to rank configurations. Through theoretical analysis, we reveal the root cause of rank collapse: a mismatch between rank-agnostic aggregation weights and rank-dependent client contributions, which systematically suppresses higher-rank updates at a geometric rate over rounds. Motivated by this insight, we propose raFLoRA, a rank-partitioned aggregation method that decomposes local updates into rank partitions and then aggregates each partition weighted by its effective client contributions. Extensive experiments across classification an...
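The mechanism described in the abstract can be illustrated with a minimal numerical sketch. The snippet below is not the paper's implementation; the dimensions, client ranks, and the exact weighting scheme are assumptions for illustration. It contrasts rank-agnostic uniform averaging of zero-padded LoRA factors, where rank columns above the minimum shared rank are divided by the full client count even though only a subset of clients contributes to them, with a rank-partitioned average that normalizes each column by its actual number of contributors.

```python
import numpy as np

rng = np.random.default_rng(0)
d, ranks = 16, [2, 4, 8]          # hypothetical client LoRA ranks
r_max = max(ranks)

# Simulated per-client low-rank factors, zero-padded to r_max columns.
B = [np.pad(rng.normal(size=(d, r)), ((0, 0), (0, r_max - r))) for r in ranks]

# Rank-agnostic aggregation: uniform average over all clients. Columns
# beyond a client's rank contribute only zeros, so high-rank columns are
# over-divided; repeated over rounds this compounds geometrically.
naive = sum(B) / len(B)

# Rank-partitioned aggregation (sketch): average each rank column only
# over the clients whose rank actually covers that column.
contributors = np.array([sum(r > j for r in ranks) for j in range(r_max)])
partitioned = sum(B) / contributors          # broadcasts over columns

# Per-column energy (squared norm): naive averaging shrinks a column with
# c contributors by an extra factor of (len(ranks) / c) ** 2.
naive_energy = (naive ** 2).sum(axis=0)
part_energy = (partitioned ** 2).sum(axis=0)
print("naive      :", naive_energy.round(2))
print("partitioned:", part_energy.round(2))
```

In this toy setup only one of the three clients covers columns 4 to 7, so uniform averaging leaves those columns with 1/9 of the energy that contributor-weighted averaging preserves, matching the suppression pattern the abstract attributes to rank-agnostic weights.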