[2602.18866] Boosting for Vector-Valued Prediction and Conditional Density Estimation

arXiv - Machine Learning

Summary

This paper explores boosting techniques for vector-valued prediction and conditional density estimation, addressing theoretical gaps in aggregation beyond scalar losses.

Why It Matters

Understanding boosting in vector-valued contexts enhances predictive modeling capabilities in machine learning. This research provides a theoretical foundation for improving algorithms, which can lead to better performance in various applications, including structured prediction tasks.

Key Takeaways

  • Introduces $(\alpha,\beta)$-boostability as a stability condition for aggregation.
  • Demonstrates that geometric median aggregation can achieve boostability across various divergences.
  • Proposes a new boosting framework, GeoMedBoost, which generalizes existing algorithms.
  • Characterizes boostability under common divergences, revealing key distinctions in performance.
  • Provides insights into handling KL divergence indirectly through Hellinger distance.
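The aggregation primitive behind these results is the geometric median of the weak learners' predictions. As an illustrative sketch only (not the paper's implementation), Weiszfeld's algorithm computes the Euclidean geometric median; its robustness to a minority of poor predictions is the property that lets aggregation amplify weak guarantees:

```python
import numpy as np

def geometric_median(points, tol=1e-7, max_iter=200):
    """Weiszfeld's algorithm: iteratively re-weighted mean converging to
    the point minimizing the sum of Euclidean distances to `points`."""
    points = np.asarray(points, dtype=float)
    y = points.mean(axis=0)  # initialize at the centroid
    for _ in range(max_iter):
        d = np.linalg.norm(points - y, axis=1)
        if np.any(d < tol):  # iterate landed on a data point
            return y
        w = 1.0 / d  # weights inversely proportional to distance
        y_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y

# Aggregating four weak learners' vector predictions for one input:
# three agree, one is badly off.  The geometric median stays near the
# cluster, whereas the mean would be dragged toward the outlier.
preds = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [10.0, 10.0]]
print(geometric_median(preds))
```

This is a generic sketch for the $\ell_2$ case; the paper studies how such median aggregation behaves under a range of divergences, where the geometry (and hence the tradeoffs) differs.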

Computer Science > Machine Learning · arXiv:2602.18866 (cs) · [Submitted on 21 Feb 2026]

Title: Boosting for Vector-Valued Prediction and Conditional Density Estimation
Authors: Jian Qian, Shu Ge

Abstract: Despite the widespread use of boosting in structured prediction, a general theoretical understanding of aggregation beyond scalar losses remains incomplete. We study vector-valued and conditional density prediction under general divergences and identify stability conditions under which aggregation amplifies weak guarantees into strong ones. We formalize this stability property as $(\alpha,\beta)$-boostability. We show that geometric median aggregation achieves $(\alpha,\beta)$-boostability for a broad class of divergences, with tradeoffs that depend on the underlying geometry. For vector-valued prediction and conditional density estimation, we characterize boostability under common divergences ($\ell_1$, $\ell_2$, total variation, and Hellinger) with geometric median, revealing a sharp distinction between dimension-dependent and dimension-free regimes. We further show that while KL divergence is not directly boostable via geometric median aggregation, it can be handled indirectly through boostability under Hellinger distance. Building on these structural results, we propose a generic boosting framework GeoMedBoost based...
