[2509.24302] LEAF: Language-EEG Aligned Foundation Model for Brain-Computer Interfaces
Computer Science > Machine Learning
arXiv:2509.24302 (cs)
[Submitted on 29 Sep 2025 (v1), last revised 23 Mar 2026 (this version, v2)]

Title: LEAF: Language-EEG Aligned Foundation Model for Brain-Computer Interfaces
Authors: Muyun Jiang, Shuailei Zhang, Zhenjie Yang, Mengjun Wu, Weibang Jiang, Zhiwei Guo, Wei Zhang, Rui Liu, Shangen Zhang, Yong Li, Yi Ding, Cuntai Guan

Abstract: Recent advances in electroencephalography (EEG) foundation models, which capture transferable EEG representations, have greatly accelerated the development of brain-computer interfaces (BCIs). However, existing approaches still struggle to incorporate language instructions as prior constraints for EEG representation learning, limiting their ability to leverage the semantic knowledge inherent in language to unify different labels and tasks. To address this challenge, we present LEAF, a foundation model for EEG-Language Alignment with Semantic Task Instruction and Querying. LEAF integrates task-aware semantic guidance to produce structured and linguistically aligned EEG embeddings, thereby enhancing decoding robustness and transferability. In the pretraining stage, we introduce a joint Spectral-Temporal Reconstruction (STR) framework that captures the coupled spectral rhythms and temporal dynamics of EEG signals. STR applies randomiz...
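The abstract's core idea of producing "linguistically aligned EEG embeddings" is commonly realized with a CLIP-style symmetric contrastive objective between paired EEG and text encoders. The paper's exact alignment loss is not given in the text above, so the sketch below is an illustrative assumption, not LEAF's actual implementation: a batch-wise InfoNCE loss in which each EEG embedding is pulled toward its matching instruction embedding and pushed away from the other pairs in the batch.

```python
import numpy as np

def language_eeg_alignment_loss(eeg_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of (EEG, instruction) pairs.

    eeg_emb, text_emb: (B, D) arrays from hypothetical EEG/text encoders;
    row i of each is assumed to be a matched pair. This is a generic
    CLIP-style objective, not the loss defined in the LEAF paper.
    """
    # L2-normalize so the dot product is cosine similarity
    eeg = eeg_emb / np.linalg.norm(eeg_emb, axis=1, keepdims=True)
    txt = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = eeg @ txt.T / temperature  # (B, B); matched pairs on the diagonal

    def xent_diag(l):
        # cross-entropy with the diagonal as the target class per row
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(l) / np.exp(l).sum(axis=1, keepdims=True)
        idx = np.arange(l.shape[0])
        return -np.log(p[idx, idx]).mean()

    # average the EEG-to-text and text-to-EEG directions
    return 0.5 * (xent_diag(logits) + xent_diag(logits.T))
```

With this objective, perfectly aligned pairs (identical embeddings up to scale) drive the loss toward zero, while unrelated pairings keep it near log(B); the temperature controls how sharply mismatches are penalized.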