[2603.22727] Spiking Personalized Federated Learning for Brain-Computer Interface-Enabled Immersive Communication
Computer Science > Machine Learning
arXiv:2603.22727 (cs) [Submitted on 24 Mar 2026]

Title: Spiking Personalized Federated Learning for Brain-Computer Interface-Enabled Immersive Communication
Authors: Chen Shang, Dinh Thai Hoang, Diep N. Nguyen, Jiadong Yu

Abstract: This work proposes a novel immersive communication framework that leverages a brain-computer interface (BCI) to acquire brain signals for inferring user-centric states (e.g., intention and perception-related discomfort), thereby enabling more personalized and robust immersive adaptation under strong individual variability. Specifically, we develop a personalized federated learning (PFL) model to analyze and process the collected brain signals, which not only accommodates neurodiverse brain-signal data but also prevents the leakage of sensitive brain-signal information. To address the energy bottleneck of continual on-device learning and inference on energy-limited immersive terminals (e.g., head-mounted displays), we further embed spiking neural networks (SNNs) into the PFL. By exploiting sparse, event-driven spike computation, the SNN-enabled PFL reduces the computation and energy cost of training and inference while maintaining competitive personalization performance. Experiments on a real brain-signal dataset demonstrate t...
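To make the two ingredients of the abstract concrete, the following is a minimal, hypothetical sketch (not the paper's actual architecture or training procedure): a leaky integrate-and-fire (LIF) spiking layer illustrating event-driven binary-spike computation, and a personalized federated averaging step that aggregates only a shared backbone while each client keeps its personal head local. All names (`lif_forward`, `personalized_fedavg`, `w_b`, `w_h`) and parameter values are illustrative assumptions.

```python
import numpy as np

def lif_forward(inputs, w, v_th=1.0, decay=0.9):
    """Leaky integrate-and-fire layer (illustrative): the membrane
    potential leaks, integrates the weighted input current, emits a
    binary spike when it crosses v_th, and resets. Downstream layers
    only need to act on the sparse spike events."""
    T = inputs.shape[0]
    v = np.zeros(w.shape[1])
    spikes = np.zeros((T, w.shape[1]))
    for t in range(T):
        v = decay * v + inputs[t] @ w      # leaky integration of input current
        fired = v >= v_th                  # threshold crossing -> spike
        spikes[t] = fired.astype(float)
        v = np.where(fired, 0.0, v)        # hard reset of neurons that fired
    return spikes

def personalized_fedavg(client_models, shared_keys):
    """Personalized FL aggregation (illustrative): average only the
    shared backbone parameters across clients; personal (head)
    parameters never leave the device, which both personalizes the
    model and keeps sensitive brain-signal-specific weights local."""
    avg = {k: np.mean([m[k] for m in client_models], axis=0)
           for k in shared_keys}
    for m in client_models:
        m.update({k: avg[k].copy() for k in shared_keys})
    return client_models

# Toy demo: 3 clients, shared backbone 'w_b', personal head 'w_h'.
rng = np.random.default_rng(0)
clients = [{'w_b': rng.normal(size=(8, 4)), 'w_h': rng.normal(size=(4, 2))}
           for _ in range(3)]
clients = personalized_fedavg(clients, shared_keys=['w_b'])

# After aggregation the backbones agree; the heads stay personalized.
assert all(np.allclose(c['w_b'], clients[0]['w_b']) for c in clients)
assert not np.allclose(clients[0]['w_h'], clients[1]['w_h'])

# Run 20 timesteps of toy input through the shared spiking backbone.
spikes = lif_forward(rng.random((20, 8)), clients[0]['w_b'])
print("spike rate:", spikes.mean())
```

The sketch shows only the structural split the abstract describes (shared spiking backbone, local personal head); the paper's actual SNN training method, aggregation rule, and hyperparameters are not specified in the abstract.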