[2603.04323] PTOPOFL: Privacy-Preserving Personalised Federated Learning via Persistent Homology

arXiv - Machine Learning

Computer Science > Machine Learning
arXiv:2603.04323 (cs) [Submitted on 4 Mar 2026]

Title: PTOPOFL: Privacy-Preserving Personalised Federated Learning via Persistent Homology
Authors: Kelly L Vomo-Donfack, Adryel Hoszu, Grégory Ginot, Ian Morilla

Abstract: Federated learning (FL) faces two structural tensions: gradient sharing enables data-reconstruction attacks, while non-IID client distributions degrade aggregation quality. We introduce PTOPOFL, a framework that addresses both challenges simultaneously by replacing gradient communication with topological descriptors derived from persistent homology (PH). Clients transmit only 48-dimensional PH feature vectors (compact shape summaries whose many-to-one structure makes inversion provably ill-posed) rather than model gradients. The server performs topology-guided personalised aggregation: clients are clustered by Wasserstein similarity between their PH diagrams, intra-cluster models are topology-weighted, and clusters are blended with a global consensus. We prove an information-contraction theorem showing that PH descriptors leak strictly less mutual information per sample than gradients under strongly convex loss functions, and we establish linear convergence of the Wasserstein-weighted aggregation scheme with an error floor strictly smaller t...
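The server-side aggregation described in the abstract (weight clients by topological similarity, then blend with a global consensus) can be sketched roughly as follows. This is a simplified illustration under stated assumptions, not the paper's algorithm: a Gaussian kernel on Euclidean distances between the 48-dimensional PH feature vectors stands in for the Wasserstein clustering over persistence diagrams, and the function and parameter names (`aggregate`, `sigma`, `blend`) are hypothetical.

```python
import numpy as np

def aggregate(models, ph_features, sigma=1.0, blend=0.5):
    """Hypothetical sketch of topology-guided personalised aggregation.

    models      : (n_clients, n_params) array of flattened client models.
    ph_features : (n_clients, 48) array of per-client PH feature vectors.
    sigma       : kernel bandwidth for similarity weighting (assumption).
    blend       : mix between the personalised average and the global mean.
    """
    global_model = models.mean(axis=0)          # global consensus
    personalised = np.empty_like(models)
    for i in range(len(models)):
        # Similarity of every client to client i in PH-feature space.
        # (Stand-in for the paper's Wasserstein distance between diagrams.)
        d = np.linalg.norm(ph_features - ph_features[i], axis=1)
        w = np.exp(-(d / sigma) ** 2)
        w /= w.sum()
        # Topology-weighted average, blended with the global consensus.
        local = (w[:, None] * models).sum(axis=0)
        personalised[i] = blend * local + (1 - blend) * global_model
    return personalised
```

With `blend=0.0` every client receives the plain global average, recovering standard FedAvg-style aggregation; increasing `blend` pulls each client toward topologically similar peers.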

Originally published on March 05, 2026. Curated by AI News.

