[2602.19355] Active perception and disentangled representations allow continual, episodic zero and few-shot learning
Summary
This paper presents a Complementary Learning System (CLS) that enables continual, episodic zero- and few-shot learning by combining active perception with disentangled representations, pairing a fast, non-generalizing learner with a slow, statistical one.
Why It Matters
The research addresses the challenge of generalization in machine learning, particularly in scenarios requiring rapid learning with minimal data. By proposing a system that balances fast, context-driven reasoning with structured generalization, it opens new pathways for robust continual learning, which is crucial for advancing AI applications in dynamic environments.
Key Takeaways
- Introduces a Complementary Learning System (CLS) whose fast learner operates as a parallel reasoning system rather than only a replay memory.
- Demonstrates the coexistence of fast reasoning and slow generalization.
- Addresses the destructive interference that entangled representations suffer under rapid, high-magnitude updates.
- Highlights the importance of disentangled representations for learning.
- Provides a framework for robust continual learning in AI.
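The interference takeaway can be made concrete with a toy sketch (not the paper's model): a linear model trained with delta-rule updates forgets an earlier task when a later task reuses ("entangles") one of its features, but retains it perfectly when the tasks use disjoint ("disentangled") features. All task vectors and learning-rate values here are illustrative assumptions.

```python
# Toy illustration of destructive interference: rapid updates on a shared
# feature overwrite earlier knowledge; disjoint features leave it intact.

def train(weights, x, target, lr=0.5, steps=20):
    """Delta-rule updates on a linear model y = w . x (in place)."""
    for _ in range(steps):
        y = sum(w * xi for w, xi in zip(weights, x))
        err = target - y
        for i, xi in enumerate(x):
            weights[i] += lr * err * xi
    return weights

def predict(weights, x):
    return sum(w * xi for w, xi in zip(weights, x))

task_a = ([1, 1, 0, 0], 1.0)              # task A uses features 0 and 1
task_b_entangled = ([0, 1, 1, 0], -1.0)   # shares feature 1 with task A
task_b_disjoint  = ([0, 0, 1, 1], -1.0)   # no feature overlap with task A

for name, task_b in [("entangled", task_b_entangled),
                     ("disentangled", task_b_disjoint)]:
    w = [0.0] * 4
    train(w, *task_a)                     # learn task A first
    train(w, *task_b)                     # then rapidly learn task B
    print(f"{name}: task A prediction after learning B = "
          f"{predict(w, task_a[0]):+.3f} (target {task_a[1]:+.1f})")
```

Running this shows task A's prediction drifting away from its target in the entangled case while remaining exact in the disentangled case, which is the failure mode the fast, disentangled learner is designed to avoid.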
Computer Science > Machine Learning — arXiv:2602.19355 (cs)
[Submitted on 22 Feb 2026]
Title: Active perception and disentangled representations allow continual, episodic zero and few-shot learning
Authors: David Rawlinson, Gideon Kowadlo
Abstract: Generalization is often regarded as an essential property of machine learning systems. However, perhaps not every component of a system needs to generalize. Training models for generalization typically produces entangled representations at the boundaries of entities or classes, which can lead to destructive interference when rapid, high-magnitude updates are required for continual or few-shot learning. Techniques for fast learning with non-interfering representations exist, but they generally fail to generalize. Here, we describe a Complementary Learning System (CLS) in which the fast learner entirely foregoes generalization in exchange for continual zero-shot and few-shot learning. Unlike most CLS approaches, which use episodic memory primarily for replay and consolidation, our fast, disentangled learner operates as a parallel reasoning system. The fast learner can overcome observation variability and uncertainty by leveraging a conventional slow, statistical learner within an active perception system: A contextual bias provided by t...
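The fast learner described in the abstract can be sketched, under loose assumptions, as a non-parametric episodic memory: each episode is written to its own slot, so fast writes never overwrite earlier knowledge, and recall is a nearest-neighbor read. This is a hypothetical illustration of the general CLS idea, not the authors' implementation; the class name and interface are invented for the example.

```python
# Hypothetical fast learner: one-shot, interference-free episodic writes
# with nearest-neighbor reads for zero/few-shot recall.

class EpisodicMemory:
    def __init__(self):
        self.keys, self.values = [], []

    def write(self, key, value):
        """Continual learning: appending never disturbs stored episodes."""
        self.keys.append(tuple(key))
        self.values.append(value)

    def read(self, query):
        """Recall the value of the nearest stored episode (L2 distance)."""
        dists = [sum((k - q) ** 2 for k, q in zip(key, query))
                 for key in self.keys]
        return self.values[dists.index(min(dists))]

mem = EpisodicMemory()
mem.write((0.0, 0.0), "context A")   # single exposure is enough
mem.write((5.0, 5.0), "context B")
print(mem.read((0.5, 0.0)))          # recalls "context A"
```

In the paper's framing, such a memory would sit alongside a slow statistical learner, which supplies the generalization (e.g., robustness to observation variability) that the episodic store deliberately forgoes.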