[2603.18596] Elastic Weight Consolidation Done Right for Continual Learning
Computer Science > Machine Learning
arXiv:2603.18596 (cs)
[Submitted on 19 Mar 2026 (v1), last revised 24 Mar 2026 (this version, v2)]

Title: Elastic Weight Consolidation Done Right for Continual Learning
Authors: Xuan Liu, Xiaobin Chang

Abstract: Weight regularization methods in continual learning (CL) alleviate catastrophic forgetting by assessing and penalizing changes to important model weights. Elastic Weight Consolidation (EWC) is a foundational and widely used approach within this framework that estimates weight importance from gradients. However, it has consistently shown suboptimal performance. In this paper, we conduct a systematic analysis of importance estimation in EWC from a gradient-based perspective. For the first time, we find that EWC's reliance on the Fisher Information Matrix (FIM) leads to gradient vanishing and inaccurate importance estimates in certain scenarios. Our analysis also reveals that Memory Aware Synapses (MAS), a variant of EWC, imposes unnecessary constraints on parameters irrelevant to prior tasks, which we term redundant protection. Consequently, both EWC and its variants exhibit fundamental misalignments in estimating weight importance, leading to inferior performance. To tackle these issues, we propose the Logits Reversal (LR) operation, a simple yet effective modification that re...
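To make the mechanism under discussion concrete, here is a minimal NumPy sketch of the standard EWC recipe the abstract refers to: the diagonal of the empirical Fisher Information Matrix is estimated from squared per-sample gradients of the log-likelihood at the old-task optimum, and a quadratic penalty anchors each weight in proportion to that importance. This is an illustrative logistic-regression toy, not the paper's proposed Logits Reversal method; the function names, the synthetic data, and the weight vector `w_star` are all assumptions for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fisher_diagonal(w, X, y):
    """Diagonal empirical Fisher for logistic regression: mean of
    squared per-sample gradients of the log-likelihood at w."""
    p = sigmoid(X @ w)                  # predicted probabilities
    grads = (y - p)[:, None] * X        # per-sample d log L / d w
    return np.mean(grads ** 2, axis=0)  # importance of each weight

def ewc_penalty(w_new, w_star, fisher, lam=1.0):
    """EWC quadratic penalty: (lam / 2) * sum_i F_i (w_i - w*_i)^2,
    added to the new task's loss during continual training."""
    return 0.5 * lam * np.sum(fisher * (w_new - w_star) ** 2)

# Hypothetical old-task data and optimum, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = (X[:, 0] > 0).astype(float)
w_star = np.array([1.0, 0.0, 0.0])

F = fisher_diagonal(w_star, X, y)
print("Fisher diagonal:", F)
print("penalty at a perturbed point:",
      ewc_penalty(w_star + 0.1, w_star, F, lam=10.0))
```

Note that the penalty vanishes exactly at the old-task optimum and grows quadratically with deviation, weighted by each parameter's estimated importance. The gradient-vanishing and redundant-protection failure modes the abstract describes concern how this Fisher-based importance can mis-rank parameters, not the quadratic anchoring itself.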