[2602.21104] Ski Rental with Distributional Predictions of Unknown Quality

arXiv - Machine Learning

Summary

This paper studies the ski rental problem in the "algorithms with predictions" framework, where the prediction is a distribution over the number of ski days rather than a single point estimate. It presents an algorithm whose expected cost is close to that of the optimal policy for the true distribution, even when the quality of the predicted distribution is unknown.

Why It Matters

Understanding distributional predictions in online algorithms can significantly enhance decision-making processes in uncertain environments. This research offers a robust approach to minimize costs without requiring prior knowledge of prediction errors, making it relevant for fields like machine learning and operations research.

Key Takeaways

  • Introduces a novel algorithm for the ski rental problem that takes a predicted distribution over ski days as input.
  • Bounds the expected cost by OPT plus an additive term that scales with the prediction error.
  • Guarantees consistency and robustness without requiring prior knowledge of the prediction error.
  • Provides matching lower bounds indicating that the algorithm's guarantees are essentially tight.
  • Informs decision-making in uncertain environments by degrading gracefully as predictions worsen.
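The prediction error in the paper's bounds is the Earth Mover's (Wasserstein-1) distance between the true and predicted distributions. For distributions over integer ski days, this reduces to a sum of absolute CDF differences. A minimal sketch of that computation (the dict-of-probabilities representation is an illustrative choice, not the paper's code):

```python
def wasserstein1(p, q):
    """Earth Mover's (Wasserstein-1) distance between two distributions
    over non-negative integer ski days, via the 1-D CDF formula
    W1(p, q) = sum_d |P(d) - Q(d)|.
    `p` and `q` map day -> probability; missing days have probability 0."""
    horizon = max(max(p), max(q))
    cdf_p = cdf_q = 0.0
    dist = 0.0
    for d in range(horizon + 1):
        cdf_p += p.get(d, 0.0)
        cdf_q += q.get(d, 0.0)
        dist += abs(cdf_p - cdf_q)
    return dist

# Point masses at day 5 and day 8 are W1-distance 3 apart.
eta = wasserstein1({5: 1.0}, {8: 1.0})  # 3.0
```

Intuitively, a prediction that is shifted by a few days yields a small η, so the additive loss term O(max(η, 1) · √b) stays well below the trivial bound of b.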

Computer Science > Machine Learning
arXiv:2602.21104 (cs) [Submitted on 24 Feb 2026]

Title: Ski Rental with Distributional Predictions of Unknown Quality
Authors: Qiming Cui, Michael Dinitz

Abstract: We revisit the central online problem of ski rental in the "algorithms with predictions" framework from the point of view of distributional predictions. Ski rental was one of the first problems to be studied with predictions, where a natural prediction is simply the number of ski days. But it is both more natural and potentially more powerful to think of a prediction as a distribution p̂ over the ski days. If the true number of ski days is drawn from some true (but unknown) distribution p, then we show as our main result that there is an algorithm with expected cost at most OPT + O(min(max(η, 1) · √b, b log b)), where OPT is the expected cost of the optimal policy for the true distribution p, b is the cost of buying, and η is the Earth Mover's (Wasserstein-1) distance between p and p̂. Note that when η = o(√b) this gives additive loss less than b (the trivial bound), and when η is arbitrarily large (corresponding to an extremely inaccurate prediction) we still do not pay more than O(b log b) additive loss. An implication of these bounds is that our algorithm has consistency O(√b) (additive...
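As a concrete baseline (not the paper's algorithm, which adds safeguards to achieve the robustness bound), one can play the threshold policy that is optimal for the predicted distribution p̂: rent at unit cost per day until some day t, then buy at cost b. A minimal sketch under these conventions:

```python
def expected_cost(threshold, dist, b):
    """Expected cost of renting until day `threshold`, then buying,
    when the number of ski days is drawn from `dist` (day -> probability),
    renting costs 1 per day, and buying costs `b`."""
    total = 0.0
    for days, prob in dist.items():
        if days < threshold:
            cost = days                  # rented every day, never bought
        else:
            cost = (threshold - 1) + b   # rented threshold-1 days, then bought
        total += prob * cost
    return total

def best_threshold(dist, b):
    """Threshold minimizing expected cost under the predicted distribution."""
    horizon = max(dist) + 1
    return min(range(1, horizon + 1), key=lambda t: expected_cost(t, dist, b))

# Example: prediction says 3 days w.p. 0.5, 20 days w.p. 0.5; buying costs 10.
t = best_threshold({3: 0.5, 20: 0.5}, 10)  # t = 4: expected cost 8.0
```

If p̂ is exactly the true distribution p, this policy attains OPT; the paper's contribution is bounding how much it can lose, and how to limit that loss, when p̂ is off by an unknown amount η.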
