[2602.22269] CQSA: Byzantine-robust Clustered Quantum Secure Aggregation in Federated Learning


Summary

The paper presents Clustered Quantum Secure Aggregation (CQSA), a novel framework for Byzantine-robust secure aggregation in federated learning, addressing vulnerabilities in current quantum secure aggregation protocols.

Why It Matters

As federated learning becomes increasingly prevalent, ensuring the security and integrity of model updates is crucial. CQSA offers a solution to the challenges posed by Byzantine attacks, enhancing the robustness of quantum-assisted federated learning systems, which is vital for applications requiring high data privacy and security.

Key Takeaways

  • CQSA partitions clients into clusters for localized quantum aggregation, improving fidelity and robustness.
  • The framework addresses the limitations of existing quantum secure aggregation protocols, particularly in large-scale scenarios.
  • Statistical analysis methods are employed to identify malicious contributions from Byzantine clients.
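The takeaways above can be illustrated with a minimal classical sketch: randomly partition client updates into clusters, aggregate within each cluster, then use a robust statistic (here a median/MAD screen, our own assumption, not necessarily the paper's test) to discard cluster aggregates that look Byzantine. The function name, cluster count, and threshold are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def clustered_robust_aggregate(updates, n_clusters, z_thresh=2.5):
    """Illustrative sketch (not the CQSA protocol): partition clients
    into random clusters, average within each cluster, then drop
    cluster means whose distance to the median mean is a MAD outlier."""
    n = len(updates)
    clusters = np.array_split(rng.permutation(n), n_clusters)
    # localized aggregation: one mean per cluster
    cluster_means = np.array([updates[idx].mean(axis=0) for idx in clusters])
    # statistical screening of suspicious clusters
    med = np.median(cluster_means, axis=0)
    dists = np.linalg.norm(cluster_means - med, axis=1)
    mad = np.median(np.abs(dists - np.median(dists))) + 1e-12
    keep = np.abs(dists - np.median(dists)) / mad < z_thresh
    return cluster_means[keep].mean(axis=0)

# 20 honest clients near [1, 1]; 2 Byzantine clients sending large values
honest = rng.normal(1.0, 0.1, size=(20, 2))
byz = np.full((2, 2), 50.0)
updates = np.vstack([honest, byz])
agg = clustered_robust_aggregate(updates, n_clusters=6)
```

Despite the poisoned updates, the screened aggregate stays close to the honest mean, because contaminated clusters sit far from the median cluster mean and are filtered out.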

Computer Science > Machine Learning — arXiv:2602.22269 (cs) [Submitted on 25 Feb 2026]

Title: CQSA: Byzantine-robust Clustered Quantum Secure Aggregation in Federated Learning
Authors: Arnab Nath, Harsh Kasyap

Abstract: Federated Learning (FL) enables collaborative model training without sharing raw data. However, shared local model updates remain vulnerable to inference and poisoning attacks. Secure aggregation schemes have been proposed to mitigate these attacks. In this work, we aim to understand how these techniques are implemented in quantum-assisted FL. Quantum Secure Aggregation (QSA) has been proposed, offering information-theoretic privacy by encoding client updates into the global phase of multipartite entangled states. Existing QSA protocols, however, rely on a single global Greenberger-Horne-Zeilinger (GHZ) state shared among all participating clients. This design poses fundamental challenges: (i) the fidelity of large-scale GHZ states deteriorates rapidly as the number of clients grows; and (ii) the global aggregation prevents the detection of Byzantine clients. We propose Clustered Quantum Secure Aggregation (CQSA), a modular aggregation framework that reconciles the physical constraints of near-term quantum hardware with the need for Byzantine robustness in FL. CQSA randomly partition...
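The abstract's phase-encoding idea admits a simple classical analogue: each client advances a shared phase by an angle proportional to its quantized update, and the final "measurement" reveals only the total phase, i.e., the sum of updates modulo the quantization range. This is a toy sketch, not actual quantum mechanics, and the modulus is an assumed parameter rather than one from the paper.

```python
import math

def encode_updates_in_phase(updates, modulus=256):
    """Toy classical analogue of QSA phase encoding: each client adds
    its quantized update to a shared phase; only the aggregate
    (mod `modulus`, an assumed parameter) is recoverable, so no
    individual contribution is exposed."""
    step = 2 * math.pi / modulus
    phase = 0.0
    for u in updates:
        phase = (phase + (u % modulus) * step) % (2 * math.pi)
    # measurement recovers only the sum of updates, modulo `modulus`
    return round(phase / step) % modulus

total = encode_updates_in_phase([3, 10, 7])  # server learns 20, not 3/10/7
```

The privacy intuition is that the server observes only the accumulated phase; any decomposition of 20 into three client values is equally consistent with that observation.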
