[2503.21258] Learn by Reasoning: Analogical Weight Generation for Few-Shot Class-Incremental Learning

arXiv - AI 4 min read Article

Summary

This paper presents a novel approach to Few-Shot Class-Incremental Learning (FSCIL): an analogical generative method that produces classifier weights for new classes, improving performance without fine-tuning parameters during incremental stages.

Why It Matters

The research addresses the challenges of FSCIL, where traditional methods struggle with limited data for new classes and retaining knowledge of old ones. By leveraging human-like analogical reasoning, this approach could significantly improve learning efficiency and accuracy in AI systems, which is crucial for applications in dynamic environments.

Key Takeaways

  • Introduces the Brain-Inspired Analogical Generator (BiAG) for FSCIL.
  • Utilizes three modules for weight generation: the Weight Self-Attention Module (WSA), the Weight & Prototype Analogical Attention Module (WPAA), and the Semantic Conversion Module (SCM).
  • Demonstrates superior accuracy on miniImageNet, CUB-200, and CIFAR-100 datasets compared to state-of-the-art methods.

Computer Science > Computer Vision and Pattern Recognition
arXiv:2503.21258 (cs)
[Submitted on 27 Mar 2025 (v1), last revised 23 Feb 2026 (this version, v2)]

Title: Learn by Reasoning: Analogical Weight Generation for Few-Shot Class-Incremental Learning
Authors: Jizhou Han, Chenhao Ding, Yuhang He, Songlin Dong, Qiang Wang, Xinyuan Gao, Yihong Gong

Abstract: Few-shot class-incremental learning (FSCIL) enables models to learn new classes from limited data while retaining performance on previously learned classes. Traditional FSCIL methods often require fine-tuning parameters with limited new class data and suffer from a separation between learning new classes and utilizing old knowledge. Inspired by the analogical learning mechanisms of the human brain, we propose a novel analogical generative method. Our approach includes the Brain-Inspired Analogical Generator (BiAG), which derives new class weights from existing classes without parameter fine-tuning during incremental stages. BiAG consists of three components: Weight Self-Attention Module (WSA), Weight & Prototype Analogical Attention Module (WPAA), and Semantic Conversion Module (SCM). SCM uses Neural Collapse theory for semantic conversion, WSA supplements new class weights, and WPAA computes analogies to generate new class weights. Experime...
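The core idea (deriving a new class's classifier weight by attending over existing class weights, conditioned on a few-shot prototype of the new class) can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's BiAG architecture: the module names, head counts, and tensor shapes here are illustrative stand-ins for WSA and WPAA, and the SCM/Neural Collapse step is omitted entirely.

```python
import torch
import torch.nn as nn

class AnalogicalWeightGenerator(nn.Module):
    """Toy sketch of analogical weight generation.

    Self-attention refines the bank of old-class weights (loosely analogous
    to WSA); cross-attention lets the new class's prototype query that bank
    to synthesize its weight vector (loosely analogous to WPAA). No
    parameters of the backbone or classifier are fine-tuned at this step.
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, old_weights: torch.Tensor, new_prototype: torch.Tensor) -> torch.Tensor:
        # old_weights:   (1, num_old_classes, dim) -- existing classifier weights
        # new_prototype: (1, 1, dim)               -- mean few-shot feature of the new class
        refined, _ = self.self_attn(old_weights, old_weights, old_weights)
        new_weight, _ = self.cross_attn(new_prototype, refined, refined)
        return new_weight.squeeze(1)  # (1, dim) weight vector for the new class

# Usage: generate a weight for one new class from 10 old-class weights.
gen = AnalogicalWeightGenerator(dim=64)
old_w = torch.randn(1, 10, 64)
proto = torch.randn(1, 1, 64)
w_new = gen(old_w, proto)
```

In an incremental step, `w_new` would simply be appended to the classifier's weight matrix, so adding a class requires a forward pass rather than gradient updates on scarce new-class data.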
