[2209.14267] Less is More: Rethinking Few-Shot Learning and Recurrent Neural Nets
Computer Science > Machine Learning

arXiv:2209.14267 (cs)

[Submitted on 28 Sep 2022 (v1), last revised 29 Mar 2026 (this version, v3)]

Title: Less is More: Rethinking Few-Shot Learning and Recurrent Neural Nets

Authors: Deborah Pereg, Martin Villiger, Brett Bouma, Polina Golland

Abstract: The statistical supervised learning framework assumes an input-output set with a joint probability distribution that is reliably represented by the training dataset. The learner is then required to output a prediction rule learned from the training dataset's input-output pairs. In this work, we provide meaningful insights into the asymptotic equipartition property (AEP) \citep{Shannon:1948} in the context of machine learning, and illuminate some of its potential ramifications for few-shot learning. We provide theoretical guarantees for reliable learning under the information-theoretic AEP, and for the generalization error with respect to the sample size. We then focus on a highly efficient recurrent neural net (RNN) framework and propose a reduced-entropy algorithm for few-shot learning. We also propose a mathematical intuition for the RNN as an approximation of a sparse coding solver. We verify the applicability, robustness, and computational efficiency of the proposed approach with image deblurring and optical coherence tomography...
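
As background, a standard textbook statement of the AEP (not necessarily the paper's precise learning-theoretic formulation) is the following: for a sequence X_1, ..., X_n drawn i.i.d. from p(x),

\[
  -\frac{1}{n}\log p(X_1,\dots,X_n) \;\longrightarrow\; H(X) \quad \text{in probability},
\]

so the typical set \(A_\epsilon^{(n)} = \{x^n : 2^{-n(H(X)+\epsilon)} \le p(x^n) \le 2^{-n(H(X)-\epsilon)}\}\) carries probability close to 1 for large n while containing at most \(2^{n(H(X)+\epsilon)}\) sequences.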
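The abstract's view of an RNN as an approximation of a sparse coding solver echoes the standard unrolling of iterative shrinkage-thresholding (ISTA) into a recurrent cell with tied weights. A minimal sketch of that idea, assuming a fixed dictionary D and the usual soft-threshold nonlinearity (the paper's exact architecture and algorithm may differ), is:

import numpy as np

def soft_threshold(v, lam):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista_as_rnn(y, D, lam=0.1, n_steps=10):
    # Unrolled ISTA for min_x 0.5*||y - D x||^2 + lam*||x||_1.
    # Each iteration has the form of a recurrent cell
    #     x_{k+1} = soft_threshold(W_y y + W_x x_k, theta)
    # with W_y = (1/L) D^T and W_x = I - (1/L) D^T D, which is the sense in
    # which an RNN can approximate a sparse coding solver.
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    W_y = D.T / L                            # input-to-state weights
    W_x = np.eye(D.shape[1]) - D.T @ D / L   # state-to-state (recurrent) weights
    x = np.zeros(D.shape[1])
    for _ in range(n_steps):
        x = soft_threshold(W_y @ y + W_x @ x, lam / L)
    return x

# Tiny usage example with a random dictionary and a 2-sparse ground truth.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17]] = [1.0, -2.0]
y = D @ x_true
x_hat = ista_as_rnn(y, D, lam=0.05, n_steps=200)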