[2509.12573] No Need for Learning to Defer? A Training Free Deferral Framework to Multiple Experts through Conformal Prediction
Computer Science > Machine Learning

arXiv:2509.12573 (cs)

[Submitted on 16 Sep 2025 (v1), last revised 28 Mar 2026 (this version, v3)]

Title: No Need for Learning to Defer? A Training Free Deferral Framework to Multiple Experts through Conformal Prediction

Authors: Tim Bary, Benoît Macq, Louis Petit

Abstract: AI systems often struggle to provide reliable predictions across all inputs, motivating hybrid human-AI decision-making. Existing Learning to Defer (L2D) approaches address this by training models to selectively defer to human experts. However, these approaches require extensive training data annotated by all experts and are sensitive to changes in expert composition, necessitating costly retraining. We propose a training-free, model- and expert-agnostic framework for expert deferral based on conformal prediction. Our method leverages prediction sets from a conformal predictor to quantify label-specific uncertainty and selects the most suitable expert using a segregativity criterion, which measures how well an expert discriminates among plausible labels. Experiments across three models on CIFAR10-H and HAM10000 demonstrate that our method can reduce the number of training labels per expert by up to 91.3% while maintaining predictive accuracy in low-data regimes. Bein...
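To make the abstract's two ingredients concrete, the sketch below combines standard split conformal prediction (the textbook recipe, not necessarily the paper's exact implementation) with a hypothetical segregativity-style expert selection: here, as an illustrative proxy, the chosen expert is the one with the highest average per-label accuracy over the labels in the prediction set. The model outputs, expert accuracy table, and the `select_expert` helper are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Split conformal prediction (standard recipe) ---
# Nonconformity score on calibration data: 1 - softmax probability of the true label.
n_cal, n_classes = 500, 10
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)  # stand-in model outputs
cal_labels = rng.integers(0, n_classes, size=n_cal)
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

alpha = 0.1  # target 90% marginal coverage
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(scores, q_level, method="higher")

def prediction_set(probs):
    """Labels whose nonconformity score falls below the calibrated threshold."""
    return np.flatnonzero(1.0 - probs <= qhat)

# --- Hypothetical segregativity-based expert selection ---
# Assumed proxy: pick the expert whose per-label accuracy, restricted to the
# plausible labels in the prediction set, is highest on average.
def select_expert(pred_set, expert_acc):
    # expert_acc: (n_experts, n_classes) per-label accuracy estimates
    return int(np.argmax(expert_acc[:, pred_set].mean(axis=1)))

test_probs = rng.dirichlet(np.ones(n_classes))
pset = prediction_set(test_probs)
expert_acc = rng.uniform(0.5, 1.0, size=(3, n_classes))
chosen = select_expert(pset, expert_acc)
print("prediction set:", pset, "-> expert", chosen)
```

Because the deferral rule only reads the model's prediction set and a per-expert accuracy table, swapping in a different model or adding an expert requires no retraining, which is the point of the framework.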