[2510.26083] Nirvana: A Specialized Generalist Model With Task-Aware Memory Mechanism
Computer Science > Machine Learning

arXiv:2510.26083 (cs)

[Submitted on 30 Oct 2025 (v1), last revised 8 Apr 2026 (this version, v2)]

Title: Nirvana: A Specialized Generalist Model With Task-Aware Memory Mechanism

Authors: Yuhua Jiang, Shuang Cheng, Yihao Liu, Ermo Hua, Che Jiang, Weigao Sun, Yu Cheng, Feifei Gao, Biqing Qi, Bowen Zhou

Abstract: Large Language Models (LLMs) excel at general language tasks but struggle in specialized domains. Specialized Generalist Models (SGMs) address this by preserving broad capabilities while adapting to target domains. However, existing architectures provide limited support for task-guided specialized memory mechanisms. In this work, we introduce Nirvana, an SGM featuring specialized memory, linear-time complexity, and test-time task information extraction. Central to Nirvana are: (1) the Task-Aware Memory Trigger ($\textit{Trigger}$), which treats each input as a self-supervised fine-tuning task and adjusts task-related parameters on the fly; and (2) the Specialized Memory Updater ($\textit{Updater}$), which dynamically consolidates task-relevant context. Nirvana matches or surpasses LLM baselines on general benchmarks and achieves the lowest perplexity across specialized domains including biomedicine, finance, and law. On the challenging task of Magnetic Resonance Imaging (MRI),...
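To make the Trigger/Updater idea concrete, the following is a minimal, hypothetical sketch of a task-aware fast-weight memory, not the paper's actual implementation: a matrix memory is updated per token with a data-dependent gate derived from a self-supervised reconstruction error, so the update rate adapts to how task-relevant (surprising) the current input is. The class and method names (`TaskAwareMemorySketch`, `trigger_gate`) are illustrative assumptions.

```python
import numpy as np

class TaskAwareMemorySketch:
    """Illustrative fast-weight memory with a task-aware update gate.

    A hedged sketch of the mechanism described in the abstract, not the
    paper's Trigger/Updater: memory M maps keys to values; the gate beta
    grows with the self-supervised prediction error, mimicking test-time
    adaptation of task-related parameters.
    """

    def __init__(self, d_key: int, d_val: int):
        self.M = np.zeros((d_val, d_key))  # fast-weight memory matrix

    def trigger_gate(self, k: np.ndarray, v: np.ndarray) -> float:
        # Self-supervised "surprise": how poorly does the current memory
        # predict v from k? Larger error -> larger update gate.
        err = v - self.M @ k
        return float(np.tanh(np.linalg.norm(err)))

    def update(self, k: np.ndarray, v: np.ndarray) -> float:
        # Gated delta-rule update: nudge M toward reproducing v from k,
        # scaled by the task-aware gate (linear cost per token).
        beta = self.trigger_gate(k, v)
        self.M += beta * np.outer(v - self.M @ k, k) / (k @ k + 1e-8)
        return beta

    def read(self, q: np.ndarray) -> np.ndarray:
        # Retrieve the consolidated context for a query.
        return self.M @ q
```

Feeding the same key-value pair repeatedly should shrink the gate (the "task" has been absorbed into memory) while the read output converges toward the stored value, which is the qualitative behavior the abstract attributes to the specialized memory.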