[2603.27269] From Foundation ECG Models to NISQ Learners: Distilling ECGFounder into a VQC Student
Quantum Physics > arXiv:2603.27269 (quant-ph)
[Submitted on 28 Mar 2026]

Title: From Foundation ECG Models to NISQ Learners: Distilling ECGFounder into a VQC Student
Authors: Giovanni dos Santos Franco, Felipe Mahlow, Ellison Fernando Cardoso, Felipe Fanchini

Abstract: Foundation models have recently improved electrocardiogram (ECG) representation learning, but their deployment can be limited by computational cost and latency constraints. In this work, we fine-tune ECGFounder as a high-capacity teacher for binary ECG classification on PTB-XL and the MIT-BIH Arrhythmia Database, and investigate whether knowledge distillation can transfer its predictive behavior to compact students. We evaluate two classical 1D students (ResNet-1D and a lightweight CNN-1D) and a quantum-ready pipeline that combines a convolutional autoencoder, which compresses 256-sample ECG windows into a low-dimensional latent representation, with a 6-qubit variational quantum circuit implemented in Qiskit and executed on a simulated backend. Across both datasets, the teacher provides the strongest overall performance, while distillation yields competitive students with a considerable reduction in trainable parameters. We further analyze the sensitivity of student performance to distillation settings, highlighting co...
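The abstract does not spell out the distillation objective, so the following is only an illustrative sketch of the kind of teacher-to-student transfer it describes: Hinton-style soft-target distillation, where a temperature-softened KL term between teacher and student logits is blended with the ordinary hard-label loss. The function name, temperature `T`, and weight `alpha` below are assumptions for the example, not settings reported by the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis (numerically stable)."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target distillation: alpha * T^2 * KL(teacher || student)
    plus (1 - alpha) * cross-entropy against the hard labels.
    T and alpha are illustrative hyperparameters, not the paper's values.
    """
    p_t = softmax(teacher_logits, T)   # softened teacher targets
    p_s = softmax(student_logits, T)   # softened student predictions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = softmax(student_logits)     # T = 1 for the hard-label term
    ce = -np.log(hard[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1.0 - alpha) * ce))

# Example: a batch of 3 binary-ECG windows (2 classes, matching the task above)
student = np.array([[1.2, -0.3], [0.1, 0.4], [2.0, -1.0]])
teacher = np.array([[2.5, -1.5], [-0.2, 0.9], [3.0, -2.0]])
labels = np.array([0, 1, 0])
loss = distillation_loss(student, teacher, labels)
```

When the student's logits exactly match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains, which is the usual sanity check for this objective.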