[2603.04589] ECG-MoE: Mixture-of-Expert Electrocardiogram Foundation Model
Computer Science > Artificial Intelligence
arXiv:2603.04589 (cs.AI)
[Submitted on 4 Mar 2026]

Title: ECG-MoE: Mixture-of-Expert Electrocardiogram Foundation Model
Authors: Yuhao Xu, Xiaoda Wang, Yi Wu, Wei Jin, Xiao Hu, Carl Yang

Abstract: Electrocardiography (ECG) analysis is crucial for cardiac diagnosis, yet existing foundation models often fail to capture the periodicity and diverse features required for varied clinical tasks. We propose ECG-MoE, a hybrid architecture that integrates multi-model temporal features with a cardiac period-aware expert module. Our approach uses a dual-path Mixture-of-Experts to separately model beat-level morphology and rhythm, combined with a hierarchical fusion network using LoRA for efficient inference. Evaluated on five public clinical tasks, ECG-MoE achieves state-of-the-art performance with 40% faster inference than multi-task baselines.

Subjects: Artificial Intelligence (cs.AI)
Cite as: arXiv:2603.04589 [cs.AI] (or arXiv:2603.04589v1 [cs.AI] for this version)
DOI: https://doi.org/10.48550/arXiv.2603.04589 (arXiv-issued DOI via DataCite, pending registration)

Submission history:
From: Yuhao Xu
[v1] Wed, 4 Mar 2026 20:36:05 UTC (559 KB)
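The abstract describes a dual-path Mixture-of-Experts that mixes separate expert paths (e.g. beat-level morphology vs. rhythm) via a learned gate. The paper's exact architecture is not given on this page; the following is only a minimal, generic sketch of softmax-gated expert mixing in NumPy, with the two-expert split and all shapes chosen for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def moe_forward(x, expert_weights, gate_weight):
    """Generic MoE mixing: run every expert, combine by gate scores.

    x:              (batch, d) input features
    expert_weights: list of (d, d_out) matrices, one per expert
                    (here: a hypothetical morphology path and rhythm path)
    gate_weight:    (d, n_experts) gating matrix
    """
    gates = softmax(x @ gate_weight)                          # (batch, n_experts)
    outs = np.stack([x @ w for w in expert_weights], axis=1)  # (batch, n_experts, d_out)
    return (gates[..., None] * outs).sum(axis=1)              # (batch, d_out)

# Toy usage with random weights (illustrative only).
rng = np.random.default_rng(0)
d, d_out = 8, 4
x = rng.normal(size=(2, d))
experts = [rng.normal(size=(d, d_out)) for _ in range(2)]  # two expert paths
gate = rng.normal(size=(d, 2))
y = moe_forward(x, experts, gate)
print(y.shape)  # (2, 4)
```

Because the gate is a softmax, the per-expert mixing weights for each input sum to one, so the output is a convex combination of the expert outputs.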