[2603.26317] Label-Free Cross-Task LoRA Merging with Null-Space Compression

Computer Science > Computer Vision and Pattern Recognition
arXiv:2603.26317 (cs) [Submitted on 27 Mar 2026]

Title: Label-Free Cross-Task LoRA Merging with Null-Space Compression
Authors: Wonyoung Lee, Wooseong Jeong, Kuk-Jin Yoon

Abstract: Model merging combines independently fine-tuned checkpoints without joint multi-task training. In the era of foundation models, fine-tuning with Low-Rank Adaptation (LoRA) is prevalent, making LoRA merging a promising target. Existing approaches work in homogeneous settings where all target tasks are classification, but they often fail when tasks span classification and regression. Approaches using entropy-based surrogates do not apply to regression and are costly for large language models due to long token sequences. We introduce Null-Space Compression (NSC) Merging, a label-free, output-agnostic method that sets merge weights from adapter geometry. Our key observation is that during LoRA fine-tuning the down-projection factor $A$ in $\Delta W = BA$ compresses its null space, and the degree of compression correlates with performance. NSC uses this as an optimization signal for merging that generalizes across classification, regression, and sequence generation. NSC achieves state-of-the-art performance across twenty heterogeneous vision tasks with balanced gains where prior methods overfit subs...
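The abstract does not say how the null-space compression of $A$ is measured or how the score becomes a merge weight, so the sketch below is only a minimal illustration under assumptions: each adapter's compression is scored from the tail of the singular values of its down-projection $A$ (a smaller tail meaning a more compressed null space), and the low-rank updates $\Delta W = BA$ are summed with weights proportional to that score. The function names `nsc_score` and `merge_loras` are hypothetical, not from the paper.

```python
import torch

def nsc_score(A: torch.Tensor, rank_keep: int | None = None) -> float:
    """Hypothetical null-space-compression score for a LoRA down-projection A (r x d).

    Assumption: an A that "compresses its null space" concentrates spectral
    energy in its leading singular directions, so we score it by one minus the
    fraction of energy in the trailing singular values. The paper's actual
    metric may differ.
    """
    s = torch.linalg.svdvals(A.float())           # singular values, descending
    k = rank_keep if rank_keep is not None else max(1, s.numel() // 2)
    tail = s[k:].pow(2).sum()
    total = s.pow(2).sum().clamp_min(1e-12)
    return float(1.0 - tail / total)              # in [0, 1]; higher = more compressed

def merge_loras(adapters: list[tuple[torch.Tensor, torch.Tensor]]) -> torch.Tensor:
    """Merge per-task LoRA updates Delta_W = B @ A with weights proportional to
    each adapter's compression score (label-free: only the geometry of A is used)."""
    scores = torch.tensor([nsc_score(A) for _, A in adapters])
    weights = scores / scores.sum().clamp_min(1e-12)
    return sum(w * (B @ A) for w, (B, A) in zip(weights.tolist(), adapters))

# Toy usage: two rank-8 adapters for a 512 x 512 weight matrix.
if __name__ == "__main__":
    d, r = 512, 8
    adapters = [(torch.randn(d, r), torch.randn(r, d)) for _ in range(2)]
    print(merge_loras(adapters).shape)  # torch.Size([512, 512])
```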

Originally published on March 30, 2026. Curated by AI News.
