[2602.22432] LoBoost: Fast Model-Native Local Conformal Prediction for Gradient-Boosted Trees

arXiv - Machine Learning · 3 min read · Article

Summary

LoBoost introduces a novel method for local conformal prediction in gradient-boosted trees, enhancing uncertainty quantification without retraining or auxiliary models.

Why It Matters

This research addresses a key limitation of standard conformal prediction: split conformal uses a single global residual quantile, so its intervals adapt poorly when noise varies across inputs (heteroscedasticity), while existing adaptive alternatives sacrifice compute or data efficiency. By improving local calibration without retraining or auxiliary models, LoBoost makes prediction intervals from gradient-boosted models more reliable in practical applications.
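To make the limitation concrete, here is a minimal sketch of plain split conformal prediction on synthetic heteroscedastic data. The data, the stand-in model, and all variable names are illustrative assumptions, not from the paper; the point is that one global residual quantile yields intervals of identical width everywhere, even where the noise is small.

```python
import numpy as np

# Illustrative split conformal prediction: a single global residual
# quantile gives every test point an interval of the same width,
# which is why it adapts poorly under heteroscedastic noise.
rng = np.random.default_rng(0)

# Hypothetical 1-D regression data whose noise grows with x.
x_cal = rng.uniform(0, 1, 500)
y_cal = 2 * x_cal + rng.normal(0, 0.1 + x_cal, 500)

pred_cal = 2 * x_cal                    # stand-in for a fitted model
residuals = np.abs(y_cal - pred_cal)    # conformity scores

alpha = 0.1
n = len(residuals)
# Finite-sample-corrected empirical quantile used by split conformal.
q = np.quantile(residuals, np.ceil((n + 1) * (1 - alpha)) / n)

x_test = np.array([0.05, 0.95])         # low-noise vs. high-noise region
pred_test = 2 * x_test
intervals = np.stack([pred_test - q, pred_test + q], axis=1)
# Both intervals have half-width q, regardless of the local noise level.
```

The interval at x = 0.05 is as wide as the one at x = 0.95, which is exactly the non-adaptivity that local methods such as LoBoost aim to fix.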

Key Takeaways

  • LoBoost leverages the leaf structure of gradient-boosted trees for efficient local conformal prediction.
  • It eliminates the need for retraining or auxiliary models, enhancing data efficiency.
  • The method shows competitive interval quality and improved test mean squared error (MSE) across various datasets.
  • Calibration speed is significantly increased, making it practical for real-time applications.
  • LoBoost's approach is particularly beneficial for handling heteroscedastic data.

Statistics > Machine Learning · arXiv:2602.22432 (stat) · Submitted on 25 Feb 2026

Title: LoBoost: Fast Model-Native Local Conformal Prediction for Gradient-Boosted Trees

Authors: Vagner Santos, Victor Coscrato, Luben Cabezas, Rafael Izbicki, Thiago Ramos

Abstract: Gradient-boosted decision trees are among the strongest off-the-shelf predictors for tabular regression, but point predictions alone do not quantify uncertainty. Conformal prediction provides distribution-free marginal coverage, yet split conformal uses a single global residual quantile and can be poorly adaptive under heteroscedasticity. Methods that improve adaptivity typically fit auxiliary nuisance models or introduce additional data splits/partitions to learn the conformal score, increasing cost and reducing data efficiency. We propose LoBoost, a model-native local conformal method that reuses the fitted ensemble's leaf structure to define multiscale calibration groups. Each input is encoded by its sequence of visited leaves; at resolution level k, we group points by matching prefixes of leaf indices across the first k trees and calibrate residual quantiles within each group. LoBoost requires no retraining, auxiliary models, or extra splitting beyond the standard train/calibration split. Experiments show competitive interval quality, im...
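The leaf-prefix grouping described in the abstract can be sketched as follows. This is only my reading of the mechanism: the synthetic leaf indices, the `min_group` threshold, and the coarsen-until-large-enough fallback rule are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Sketch of LoBoost-style multiscale calibration groups: each point is
# encoded by the leaf it lands in per tree, and points sharing the same
# leaf-index prefix across the first k trees form one calibration group.
rng = np.random.default_rng(1)

n_cal, n_trees, n_leaves = 400, 10, 4
# Stand-in for ensemble.apply(X): leaf visited in each tree per point.
leaves_cal = rng.integers(0, n_leaves, size=(n_cal, n_trees))
resid_cal = np.abs(rng.normal(0, 1, n_cal))  # calibration residuals

alpha, min_group = 0.1, 30  # min_group is an assumed fallback threshold

def local_quantile(leaves_row):
    """Use the deepest leaf-prefix group that is still large enough;
    k = 0 matches every calibration point (the global quantile)."""
    for k in range(n_trees, -1, -1):
        mask = (leaves_cal[:, :k] == leaves_row[:k]).all(axis=1)
        m = int(mask.sum())
        if m >= min_group:
            level = min(1.0, np.ceil((m + 1) * (1 - alpha)) / m)
            return np.quantile(resid_cal[mask], level)
    return np.quantile(resid_cal, 1 - alpha)  # unreachable safeguard

leaves_test = rng.integers(0, n_leaves, size=(n_trees,))
q_local = local_quantile(leaves_test)
# Prediction interval for this test point: prediction ± q_local.
```

Because the grouping reuses leaf indices the fitted ensemble already computes, no auxiliary model or extra data split is needed, which matches the data- and compute-efficiency claims in the abstract.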


