[2503.21258] Learn by Reasoning: Analogical Weight Generation for Few-Shot Class-Incremental Learning
Summary
This paper presents a novel approach to Few-Shot Class-Incremental Learning (FSCIL): an analogical generative method that produces classifier weights for new classes without fine-tuning model parameters.
Why It Matters
The research addresses the challenges of FSCIL, where traditional methods struggle with limited data for new classes and retaining knowledge of old ones. By leveraging human-like analogical reasoning, this approach could significantly improve learning efficiency and accuracy in AI systems, which is crucial for applications in dynamic environments.
Key Takeaways
- Introduces the Brain-Inspired Analogical Generator (BiAG) for FSCIL.
- Utilizes three innovative modules for weight generation: Weight Self-Attention (WSA), Weight & Prototype Analogical Attention (WPAA), and Semantic Conversion (SCM).
- Demonstrates superior accuracy on miniImageNet, CUB-200, and CIFAR-100 datasets compared to state-of-the-art methods.
arXiv:2503.21258 [cs.CV] — Submitted on 27 Mar 2025 (v1), last revised 23 Feb 2026 (v2)
Authors: Jizhou Han, Chenhao Ding, Yuhang He, Songlin Dong, Qiang Wang, Xinyuan Gao, Yihong Gong

Abstract: Few-shot class-incremental learning (FSCIL) enables models to learn new classes from limited data while retaining performance on previously learned classes. Traditional FSCIL methods often require fine-tuning parameters with limited new class data and suffer from a separation between learning new classes and utilizing old knowledge. Inspired by the analogical learning mechanisms of the human brain, we propose a novel analogical generative method. Our approach includes the Brain-Inspired Analogical Generator (BiAG), which derives new class weights from existing classes without parameter fine-tuning during incremental stages. BiAG consists of three components: Weight Self-Attention Module (WSA), Weight & Prototype Analogical Attention Module (WPAA), and Semantic Conversion Module (SCM). SCM uses Neural Collapse theory for semantic conversion, WSA supplements new class weights, and WPAA computes analogies to generate new class weights. Experime...
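The core idea — deriving a new class's classifier weight by attending over existing class weights, guided by the new class's prototype — can be illustrated with a minimal sketch. This is a hypothetical simplification for intuition only, not the paper's BiAG implementation: the function `generate_new_weight`, the dot-product attention, and the 50/50 mixing coefficient are all assumptions, and the actual WSA/WPAA/SCM modules are learned attention networks.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def generate_new_weight(old_weights, new_prototype, temperature=1.0):
    """Toy analogical weight generation (hypothetical simplification):
    score each old class weight by its similarity to the new class
    prototype, form an attention-weighted mixture of old weights,
    and blend it with the prototype itself. No parameters are tuned."""
    scores = [dot(new_prototype, w) / temperature for w in old_weights]
    attn = softmax(scores)
    dim = len(new_prototype)
    # Attention-weighted combination of old weights (the "analogy" step).
    mixed = [sum(a * w[i] for a, w in zip(attn, old_weights))
             for i in range(dim)]
    # Anchor the generated weight on the new class prototype.
    return [0.5 * p + 0.5 * m for p, m in zip(new_prototype, mixed)]

# Two old classes with orthogonal weights; the new class resembles the first,
# so its generated weight leans toward the first old weight.
old = [[1.0, 0.0], [0.0, 1.0]]
proto = [0.9, 0.1]
w_new = generate_new_weight(old, proto)
```

Because the generation step is a forward pass over fixed old-class weights, no gradient updates touch the backbone during incremental sessions — the property the paper highlights as avoiding fine-tuning on scarce new-class data.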