[2603.21170] Pruned Adaptation Modules: A Simple yet Strong Baseline for Continual Foundation Models
Computer Science > Machine Learning

arXiv:2603.21170 (cs)

[Submitted on 22 Mar 2026]

Title: Pruned Adaptation Modules: A Simple yet Strong Baseline for Continual Foundation Models

Authors: Elif Ceren Gok Yildirim, Murat Onur Yildirim, Joaquin Vanschoren

Abstract: The continual learning literature has rapidly shifted from traditional class-incremental learning (CIL) techniques to foundation model (FM)-based CIL methods without a clear understanding of how these newer approaches compare to strong, lightweight convolutional baselines. This abrupt transition has created a substantial methodological gap, making it difficult to assess whether recent FM-based CIL progress reflects genuine advances or merely the absence of rigorous baselines. To address this gap, we introduce Pruned Adaptation Modules (PAM), a simple yet effective method that freezes the vast majority of a pre-trained ResNet while enabling scalable continual adaptation through sparse task-specific layers. PAM yields up to a ~5x reduction in trainable parameters and a ~6x reduction in total parameters, significantly reducing the cost of continual updates. Across diverse benchmarks, PAM consistently mitigates catastrophic forgetting and outperforms state-of-the-art FM-based CIL approaches. Our findings position PAM as a strong baseline for continual learning with foundation models.
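The abstract describes the mechanism only at a high level: freeze a pre-trained ResNet and train small, sparse, task-specific layers on top. The following is a minimal PyTorch sketch of that general idea, not the authors' implementation. All concrete choices here are assumptions for illustration: the resnet18 backbone, the adapter shape (512 -> 256 -> num_classes), the 90% unstructured L1 pruning applied at initialization, and the one-head-per-task setup.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision.models import resnet18, ResNet18_Weights

# Frozen shared backbone; fc replaced so it emits 512-d features.
backbone = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = nn.Identity()
backbone.eval()  # also freezes batch-norm running statistics
for p in backbone.parameters():
    p.requires_grad = False

def make_task_adapter(num_classes: int, sparsity: float = 0.9) -> nn.Module:
    """One sparse task-specific module; pruning masks `sparsity` of the weights.

    NOTE: pruning by L1 magnitude at random initialization is effectively
    random pruning; the paper's actual pruning criterion and schedule
    (e.g., pruning during or after training) are likely different.
    """
    adapter = nn.Sequential(
        nn.Linear(512, 256),
        nn.ReLU(),
        nn.Linear(256, num_classes),
    )
    for m in adapter.modules():
        if isinstance(m, nn.Linear):
            prune.l1_unstructured(m, name="weight", amount=sparsity)
    return adapter

# Class-incremental loop: one new adapter per task. The backbone is never
# updated, so earlier tasks cannot be overwritten in the shared parameters;
# masked adapter weights receive zero gradient through the pruning mask.
adapters = nn.ModuleList(make_task_adapter(c) for c in [10, 10, 10])

def forward(x: torch.Tensor, task_id: int) -> torch.Tensor:
    with torch.no_grad():
        feats = backbone(x)
    return adapters[task_id](feats)

# Training a task touches only its own adapter's (unmasked) weights, e.g.:
#   optimizer = torch.optim.SGD(adapters[task_id].parameters(), lr=1e-2)

At inference, this sketch selects the adapter by task identity; a true class-incremental evaluation would also require a task-agnostic prediction rule, which the abstract does not specify and this sketch does not model.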