[2602.20153] JUCAL: Jointly Calibrating Aleatoric and Epistemic Uncertainty in Classification Tasks

arXiv – Machine Learning · 4 min read

Summary

The paper presents JUCAL, a post-hoc calibration method that jointly calibrates aleatoric and epistemic uncertainty for ensembles of classifiers, outperforming standard techniques such as temperature scaling and conformal methods.

Why It Matters

Accurately calibrated uncertainty estimates are essential for reliable predictions. By balancing aleatoric and epistemic uncertainty rather than calibrating a single pooled quantity, JUCAL improves on traditional methods and gives practitioners a practical tool for making ensemble classifiers both better calibrated and cheaper to run.

Key Takeaways

  • JUCAL effectively balances aleatoric and epistemic uncertainties in classification models.
  • The method shows superior performance compared to state-of-the-art calibration techniques.
  • JUCAL can be applied to various ensemble classifiers with minimal computational overhead.
  • It reduces negative log-likelihood (NLL) and predictive set size significantly.
  • Using JUCAL can lead to lower inference costs, enhancing efficiency.

Statistics > Machine Learning · arXiv:2602.20153 (stat) · Submitted on 23 Feb 2026

Title: JUCAL: Jointly Calibrating Aleatoric and Epistemic Uncertainty in Classification Tasks
Authors: Jakob Heiss, Sören Lambrecht, Jakob Weissteiner, Hanna Wutte, Žan Žurič, Josef Teichmann, Bin Yu

Abstract: We study post-calibration uncertainty for trained ensembles of classifiers. Specifically, we consider both aleatoric (label noise) and epistemic (model) uncertainty. Among the most popular and widely used calibration methods in classification are temperature scaling (i.e., pool-then-calibrate) and conformal methods. However, the main shortcoming of these calibration methods is that they do not balance the proportions of aleatoric and epistemic uncertainty. Failing to balance these uncertainties can severely misrepresent predictive uncertainty, leading to overconfident predictions in some input regions and underconfident predictions in others. To address this shortcoming, we present a simple but powerful calibration algorithm, Joint Uncertainty Calibration (JUCAL), that jointly calibrates aleatoric and epistemic uncertainty. JUCAL jointly calibrates two constants that weight and scale epistemic and aleatoric uncertainty by optimizing the negative log-likelihood (NLL) on the validation/calibration dataset. JUCAL can be appli...
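The abstract describes JUCAL as fitting two scalars on a calibration set by minimizing NLL: one scaling aleatoric uncertainty and one weighting epistemic uncertainty. As a rough illustration only (not the authors' implementation, which the truncated abstract does not fully specify), the sketch below fits an aleatoric temperature `t` and an epistemic spread weight `w` for an ensemble of classifiers. The particular parameterization — rescaling each member's logits around the ensemble mean by `w`, then applying temperature `t` before pooling — and all function names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize


def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def predictive_probs(member_logits, t, w):
    """member_logits: [M, N, K] logits from M ensemble members.

    w rescales the epistemic spread of members around the ensemble
    mean; t is a shared aleatoric temperature applied before pooling.
    """
    mean = member_logits.mean(axis=0, keepdims=True)
    scaled = (mean + w * (member_logits - mean)) / t
    return softmax(scaled).mean(axis=0)  # pool member probabilities


def nll(params, member_logits, labels):
    t, w = np.exp(params)  # exp-parameterize to keep both positive
    p = predictive_probs(member_logits, t, w)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))


def fit_jucal_like(member_logits, labels):
    """Jointly fit (t, w) by minimizing NLL on a calibration set."""
    res = minimize(nll, x0=np.zeros(2), args=(member_logits, labels),
                   method="Nelder-Mead")
    return np.exp(res.x)  # (t, w)
```

Setting `w = 1` and optimizing only `t` recovers ordinary pool-then-calibrate temperature scaling, which is what makes the joint two-parameter fit the distinguishing step here: the calibration data decides how much epistemic disagreement between members should count, rather than fixing it implicitly.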
