[2602.16216] UCTECG-Net: Uncertainty-aware Convolution Transformer ECG Network for Arrhythmia Detection

arXiv - AI · 3 min read

Summary

The paper presents UCTECG-Net, an uncertainty-aware convolutional Transformer network that jointly processes raw ECG signals and their spectrograms, achieving high accuracy in arrhythmia detection on two benchmark datasets.

Why It Matters

This research addresses the critical need for reliable ECG classification in medical settings, where understanding prediction uncertainty is essential for patient safety. By integrating advanced uncertainty quantification methods, UCTECG-Net enhances decision support systems in healthcare.

Key Takeaways

  • UCTECG-Net outperforms LSTM, CNN1D, and Transformer baselines in accuracy, precision, recall, and F1 score.
  • The model achieves up to 98.58% accuracy on the MIT-BIH Arrhythmia dataset and 99.14% on the PTB Diagnostic dataset.
  • Integrates three uncertainty quantification methods (Monte Carlo Dropout, Deep Ensembles, and Ensemble Monte Carlo Dropout) to improve prediction reliability.
  • Provides a stronger basis for risk-aware ECG decision support in clinical settings.
  • Demonstrates the importance of uncertainty awareness in deep learning applications for healthcare.

Computer Science > Machine Learning

arXiv:2602.16216 (cs) · Submitted on 18 Feb 2026

Title: UCTECG-Net: Uncertainty-aware Convolution Transformer ECG Network for Arrhythmia Detection

Authors: Hamzeh Asgharnezhad, Pegah Tabarisaadi, Abbas Khosravi, Roohallah Alizadehsani, U. Rajendra Acharya

Abstract: Deep learning has improved automated electrocardiogram (ECG) classification, but limited insight into prediction reliability hinders its use in safety-critical settings. This paper proposes UCTECG-Net, an uncertainty-aware hybrid architecture that combines one-dimensional convolutions and Transformer encoders to process raw ECG signals and their spectrograms jointly. Evaluated on the MIT-BIH Arrhythmia and PTB Diagnostic datasets, UCTECG-Net outperforms LSTM, CNN1D, and Transformer baselines in terms of accuracy, precision, recall, and F1 score, achieving up to 98.58% accuracy on MIT-BIH and 99.14% on PTB. To assess predictive reliability, we integrate three uncertainty quantification methods (Monte Carlo Dropout, Deep Ensembles, and Ensemble Monte Carlo Dropout) into all models and analyze their behavior using an uncertainty-aware confusion matrix and derived metrics. The results show that UCTECG-Net, particularly with Ensemble or EMCD, provides more reliable and better-aligned uncertaint...
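The Monte Carlo Dropout method mentioned in the abstract estimates uncertainty by running several stochastic forward passes and measuring the spread of the predictions. A minimal NumPy sketch of this idea is shown below; the `noisy_forward` toy model and its logit values are hypothetical stand-ins for a dropout-enabled network, not the paper's implementation.

```python
import numpy as np

def mc_dropout_uncertainty(forward_pass, x, n_samples=30):
    """Run n_samples stochastic forward passes (dropout kept active at
    inference) and return mean class probabilities and predictive entropy."""
    probs = np.stack([forward_pass(x) for _ in range(n_samples)])  # (T, C)
    mean_probs = probs.mean(axis=0)
    entropy = -np.sum(mean_probs * np.log(mean_probs + 1e-12))
    return mean_probs, entropy

# Toy stochastic "model": softmax over logits perturbed by noise that
# mimics the variance dropout injects between forward passes.
rng = np.random.default_rng(0)
def noisy_forward(x):
    logits = x + rng.normal(0.0, 0.5, size=x.shape)
    e = np.exp(logits - logits.max())
    return e / e.sum()

confident = np.array([4.0, 0.0, 0.0])  # well-separated logits
ambiguous = np.array([1.0, 1.0, 1.0])  # near-tied logits

_, h_conf = mc_dropout_uncertainty(noisy_forward, confident)
_, h_amb = mc_dropout_uncertainty(noisy_forward, ambiguous)
print(h_conf, h_amb)  # ambiguous input should carry higher entropy
```

In a real model the stochastic pass would be the network itself with dropout layers left in training mode; the entropy of the averaged prediction is one of several scores that can feed an uncertainty-aware confusion matrix like the one the paper analyzes.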

