[2603.20036] Continual Learning as Shared-Manifold Continuation Under Compatible Shift
Computer Science > Machine Learning
arXiv:2603.20036 (cs)
[Submitted on 20 Mar 2026]

Title: Continual Learning as Shared-Manifold Continuation Under Compatible Shift
Authors: Henry J. Kobs

Abstract: Continual learning methods usually preserve old behavior by regularizing parameters, matching old outputs, or replaying previous examples. These strategies can reduce forgetting, but they do not directly specify how the latent representation should evolve. We study a narrower geometric alternative for the regime where old and new data should remain on the same latent support: continual learning as continuation of a shared manifold. We instantiate this view within Support-Preserving Manifold Assimilation (SPMA) and evaluate a geometry-preserving variant, SPMA-OG, that combines sparse replay, output distillation, relational geometry preservation, local smoothing, and chart-assignment regularization on old anchors. On representative compatible-shift CIFAR-10 and Tiny-ImageNet runs, SPMA-OG improves over sparse-replay baselines in old-task retention and representation-preservation metrics while remaining competitive on new-task accuracy. On a controlled synthetic atlas-manifold benchmark, it achieves near-perfect anchor-geometry preservation while also improving new-task accuracy over replay. These results provide evidence that geometry...
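The abstract names five components that SPMA-OG combines: sparse replay, output distillation, relational geometry preservation, local smoothing, and chart-assignment regularization on old anchors. A minimal NumPy sketch of how such a combined objective might be assembled follows; every function name, loss weight, and tensor shape below is an illustrative assumption, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def replay_loss(logits, labels):
    """Cross-entropy on a sparse buffer of stored old examples."""
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def distill_loss(old_logits, new_logits):
    """KL(old || new): keep new outputs close to the frozen old model's."""
    p, q = softmax(old_logits), softmax(new_logits)
    return np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1))

def relational_geometry_loss(old_emb, new_emb):
    """Match pairwise distances among old-anchor embeddings before/after update."""
    d_old = np.linalg.norm(old_emb[:, None] - old_emb[None], axis=-1)
    d_new = np.linalg.norm(new_emb[:, None] - new_emb[None], axis=-1)
    return np.mean((d_old - d_new) ** 2)

def smoothing_loss(emb, emb_perturbed):
    """Local smoothing: embeddings of slightly perturbed inputs should not move."""
    return np.mean(np.sum((emb - emb_perturbed) ** 2, axis=-1))

def chart_assignment_loss(old_chart_logits, new_chart_logits):
    """Keep old anchors softly assigned to their original atlas charts."""
    p, q = softmax(old_chart_logits), softmax(new_chart_logits)
    return -np.mean(np.sum(p * np.log(q + 1e-12), axis=-1))

# Toy tensors standing in for model outputs (shapes are illustrative).
logits     = rng.normal(size=(8, 10))    # new model on replay buffer
labels     = rng.integers(0, 10, size=8)
old_logits = rng.normal(size=(8, 10))    # frozen old model on anchors
new_logits = old_logits + 0.1 * rng.normal(size=(8, 10))
old_emb    = rng.normal(size=(8, 16))    # anchor embeddings before update
new_emb    = old_emb + 0.05 * rng.normal(size=(8, 16))
emb_pert   = new_emb + 0.01 * rng.normal(size=(8, 16))
old_charts = rng.normal(size=(8, 4))     # chart-assignment logits
new_charts = old_charts + 0.1 * rng.normal(size=(8, 4))

# The relative weights here are placeholders; the paper does not state them.
total = (replay_loss(logits, labels)
         + 1.0 * distill_loss(old_logits, new_logits)
         + 1.0 * relational_geometry_loss(old_emb, new_emb)
         + 0.1 * smoothing_loss(new_emb, emb_pert)
         + 0.1 * chart_assignment_loss(old_charts, new_charts))
```

Each term is nonnegative, so the relational-geometry term in particular vanishes exactly when the anchors' pairwise distance matrix is unchanged, which is the "anchor-geometry preservation" the abstract measures on the synthetic atlas-manifold benchmark.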