[2602.20585] Characterizing Online and Private Learnability under Distributional Constraints via Generalized Smoothness
Summary
This paper characterizes when learning is achievable against distributional adversaries in online and private settings, showing that a notion of generalized smoothness governs sample complexity and regret bounds.
Why It Matters
Understanding the minimal assumptions needed for effective learning under distributional constraints is crucial for advancing machine learning theory. This research bridges the gap between the i.i.d. and fully adversarial learning paradigms and offers insights into online learning and differential privacy, making it relevant for both theoretical and practical applications in AI.
Key Takeaways
- Generalized smoothness characterizes learnability under distributional adversaries.
- The paper provides universal algorithms achieving low regret without prior knowledge of the distribution family.
- Refined bounds are established when the distribution family is known, enhancing understanding of sample complexity.
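The distributional adversaries above interpolate between i.i.d. and worst-case data. A minimal toy sketch of the classical smoothed setting (which the paper's generalized smoothness extends): at each round the adversary picks a σ-smooth distribution, i.e. one whose density w.r.t. Uniform[0,1] is at most 1/σ, and a follow-the-leader learner over threshold classifiers (VC dimension 1) accumulates few mistakes. All names, parameters, and the target threshold here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.25          # smoothness parameter: each round's density <= 1/sigma
T = 300               # number of rounds
true_threshold = 0.6  # labels come from a fixed target threshold (realizable case)

def smooth_sample():
    """One draw from a sigma-smooth distribution: uniform on a random
    length-sigma subinterval of [0, 1], so its density is exactly 1/sigma."""
    left = rng.uniform(0.0, 1.0 - sigma)
    return left + sigma * rng.uniform()

xs, ys = [], []
mistakes = 0
for t in range(T):
    x = smooth_sample()
    y = int(x >= true_threshold)
    # Follow-the-leader: predict with the threshold minimizing past error.
    c = 0.0
    if xs:
        X = np.array(xs)
        Y = np.array(ys)
        candidates = np.concatenate(([0.0], X))
        # errs[k] = number of past points misclassified by threshold candidates[k]
        errs = ((X[None, :] >= candidates[:, None]) != (Y[None, :] == 1)).sum(axis=1)
        c = candidates[int(np.argmin(errs))]
    pred = int(x >= c)
    mistakes += int(pred != y)
    xs.append(x)
    ys.append(y)

print(f"mistake rate over {T} smooth rounds: {mistakes / T:.3f}")
```

Because each round's distribution spreads mass over an interval rather than concentrating on adversarially chosen points, the learner's mistake count stays a small fraction of T, in line with the VC-dimension-dependent regret bounds the paper studies.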
Paper Details
arXiv:2602.20585 (stat.ML) — Submitted on 24 Feb 2026
Authors: Moïse Blanchard, Abhishek Shetty, Alexander Rakhlin
Abstract: Understanding minimal assumptions that enable learning and generalization is perhaps the central question of learning theory. Several celebrated results in statistical learning theory, such as the VC theorem and Littlestone's characterization of online learnability, establish conditions on the hypothesis class that allow for learning under independent data and adversarial data, respectively. Building upon recent work bridging these extremes, we study sequential decision making under distributional adversaries that can adaptively choose data-generating distributions from a fixed family $U$, and ask when such problems are learnable with sample complexity that behaves like the favorable independent case. We provide a near-complete characterization of the families $U$ that admit learnability in terms of a notion known as generalized smoothness, i.e., a distribution family admits VC-dimension-dependent regret bounds for every finite-VC hypothesis class if and only if it is generalized smooth. Further, we give universal al...
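For background, the generalized smoothness studied here extends the classical σ-smoothness notion from the smoothed online learning literature; that standard definition (not the paper's new one) can be stated as:

```latex
% Classical sigma-smoothness (background notion that generalized smoothness extends):
% a family U of distributions over a domain X is \sigma-smooth with respect to a
% base measure \mu if every p in U puts at most 1/\sigma times the base mass on
% any event, i.e. its density w.r.t. \mu is bounded by 1/\sigma:
\forall p \in U,\ \forall \text{ measurable } A \subseteq \mathcal{X}:\quad
p(A) \;\le\; \frac{\mu(A)}{\sigma}.
```

Intuitively, a σ-smooth adversary cannot concentrate mass on a measure-zero set of hard examples, which is what lets regret scale with the VC dimension rather than the Littlestone dimension.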