[2511.17439] InTAct: Interval-based Task Activation Consolidation for Continual Learning
Summary
The paper presents InTAct, a novel method for continual learning that mitigates catastrophic forgetting by using interval-based task activation consolidation, ensuring functional invariance at the neuron level.
Why It Matters
Continual learning is crucial for AI systems that must adapt and retain knowledge over time. InTAct offers a computationally cheaper way to prevent catastrophic forgetting, a long-standing challenge in machine learning, by constraining activations rather than high-dimensional weight updates. This could improve the reliability of models deployed in dynamic environments, making it relevant to both researchers and practitioners.
Key Takeaways
- InTAct enforces functional invariance at the neuron level to prevent catastrophic forgetting.
- The method identifies specific activation intervals for previous tasks, allowing flexible adaptation.
- It is more computationally efficient than traditional parameter-based constraints.
- InTAct is architecture-agnostic and integrates well with prompt-based methods.
- The approach achieves state-of-the-art performance on challenging benchmarks.
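To make the core idea concrete, here is a minimal toy sketch (not the authors' implementation) of interval-based activation consolidation: record per-neuron activation intervals observed on a previous task, then penalize new activations on old-task inputs that drift outside those intervals, while activations outside the protected regions remain free to adapt. The function names and the hinge-style penalty are illustrative assumptions.

```python
import numpy as np

def record_activation_intervals(activations):
    """Record per-neuron [lo, hi] intervals covering the activations
    observed on a previous task.

    activations: (n_samples, n_neurons) array of hidden activations.
    Returns (lo, hi), each of shape (n_neurons,).
    """
    return activations.min(axis=0), activations.max(axis=0)

def interval_consolidation_penalty(new_acts, lo, hi):
    """Toy consolidation loss (hypothetical): penalize only the amount by
    which activations on old-task inputs exit the recorded intervals.
    Inside the interval the penalty is zero, so the network stays free
    to move within the protected region.
    """
    below = np.maximum(0.0, lo - new_acts)   # how far under the lower bound
    above = np.maximum(0.0, new_acts - hi)   # how far over the upper bound
    return float((below + above).mean())

# Intervals recorded from task-1 activations of two neurons.
old_acts = np.array([[0.0, 1.0],
                     [2.0, 3.0]])
lo, hi = record_activation_intervals(old_acts)  # lo=[0,1], hi=[2,3]

# After a training step on task 2, re-check activations on task-1 inputs.
new_acts = np.array([[-1.0, 2.0],
                     [ 3.0, 3.5]])
penalty = interval_consolidation_penalty(new_acts, lo, hi)
print(penalty)  # → 0.625
```

In a real training loop this penalty would be added to the new-task loss, so gradient descent is discouraged from pushing old-task activations out of their consolidated intervals; the paper's method operates on activations rather than weights precisely to avoid the cost of constraining the full parameter space.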
arXiv:2511.17439 [cs.LG] — Submitted 21 Nov 2025 (v1); last revised 23 Feb 2026 (v2)
Title: InTAct: Interval-based Task Activation Consolidation for Continual Learning
Authors: Patryk Krukowski, Jan Miksa, Piotr Helm, Jacek Tabor, Paweł Wawrzyński, Przemysław Spurek
Abstract: Continual learning is a fundamental challenge in artificial intelligence that requires networks to acquire new knowledge while preserving previously learned representations. Despite the success of various approaches, most existing paradigms do not provide rigorous mathematical guarantees against catastrophic forgetting. Current methods that offer such guarantees primarily focus on analyzing the parameter space using interval arithmetic (IA), as seen in frameworks such as InterContiNet. However, restricting high-dimensional weight updates can be computationally expensive. In this work, we propose InTAct (Interval-based Task Activation Consolidation), a method that mitigates catastrophic forgetting by enforcing functional invariance at the neuron level. We identify specific activation intervals where previous tasks reside and constrain updates within these regions while allowing for flexible adaptation elsewhere. By ensuring that predictions remain stable within these nested...