[2603.16739] SpecMoE: Spectral Mixture-of-Experts Foundation Model for Cross-Species EEG Decoding
Computer Science > Machine Learning

arXiv:2603.16739 (cs)

[Submitted on 17 Mar 2026 (v1), last revised 30 Mar 2026 (this version, v2)]

Title: SpecMoE: Spectral Mixture-of-Experts Foundation Model for Cross-Species EEG Decoding

Authors: Davy Darankoum, Chloé Habermacher, Julien Volle, Sergei Grudinin

Abstract: Decoding the orchestration of neural activity in electroencephalography (EEG) signals is a central challenge in bridging neuroscience with artificial intelligence. Foundation models have made strides in generalized EEG decoding, yet many existing frameworks rely primarily on separate temporal and spectral masking of raw signals during self-supervised pretraining. Such strategies tend to bias learning toward high-frequency oscillations, as low-frequency rhythmic patterns can be easily inferred from the unmasked signal. We introduce a foundation model that applies a novel Gaussian-smoothed masking scheme to short-time Fourier transform (STFT) maps. By jointly applying time, frequency, and time-frequency Gaussian masks, we make the reconstruction task considerably more challenging, forcing the model to learn intricate neural patterns across both high- and low-frequency domains. To effectively recover signals under this aggressive masking strategy, we design SpecHi-Net, a U-shaped hie...
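The abstract gives no implementation details, so the sketch below only illustrates the general idea it describes: combining time, frequency, and time-frequency masks over an STFT map and softening them with a Gaussian blur so that masked regions have smooth edges. The helper name gaussian_smoothed_mask, the band/patch sizes, the smoothing sigma, and the sampling rate are all assumed placeholders, not the authors' code.

```python
import numpy as np
from scipy.signal import stft
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def gaussian_smoothed_mask(shape, n_freq_bands=2, n_time_spans=2,
                           n_patches=3, sigma=1.5, rng=rng):
    """Build a soft [0, 1] mask over an STFT map by combining hard
    frequency-band, time-span, and time-frequency patch masks, then
    blurring the result with a Gaussian kernel. (Hypothetical helper;
    all mask sizes and counts are assumptions.)"""
    f_bins, t_bins = shape
    hard = np.ones(shape)
    # Frequency masks: zero out whole frequency bands.
    for _ in range(n_freq_bands):
        f0 = rng.integers(0, f_bins - f_bins // 8)
        hard[f0:f0 + f_bins // 8, :] = 0.0
    # Time masks: zero out whole time spans.
    for _ in range(n_time_spans):
        t0 = rng.integers(0, t_bins - t_bins // 8)
        hard[:, t0:t0 + t_bins // 8] = 0.0
    # Time-frequency masks: zero out rectangular patches.
    for _ in range(n_patches):
        f0 = rng.integers(0, f_bins - f_bins // 6)
        t0 = rng.integers(0, t_bins - t_bins // 6)
        hard[f0:f0 + f_bins // 6, t0:t0 + t_bins // 6] = 0.0
    # Gaussian smoothing turns the binary mask into a soft-edged one.
    return gaussian_filter(hard, sigma=sigma)

# Usage: mask the magnitude STFT of a synthetic single-channel segment.
fs = 256                              # sampling rate in Hz (assumed)
x = rng.standard_normal(fs * 4)       # 4 s of surrogate signal
f, t, Z = stft(x, fs=fs, nperseg=128)
mask = gaussian_smoothed_mask(Z.shape)
masked_spec = np.abs(Z) * mask        # model input; np.abs(Z) is the target
```

In a masked-reconstruction pretraining setup of this kind, the model would receive masked_spec and be trained to reconstruct the unmasked map, so the joint masking prevents it from trivially interpolating low-frequency content from neighboring unmasked bins.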