[2603.26299] Preference-Aligned LoRA Merging: Preserving Subspace Coverage and Addressing Directional Anisotropy


Computer Science > Computer Vision and Pattern Recognition
arXiv:2603.26299 (cs) [Submitted on 27 Mar 2026]

Title: Preference-Aligned LoRA Merging: Preserving Subspace Coverage and Addressing Directional Anisotropy
Authors: Wooseong Jeong, Wonyoung Lee, Kuk-Jin Yoon

Abstract: Merging multiple Low-Rank Adaptation (LoRA) modules is promising for constructing general-purpose systems, yet challenging because LoRA update directions span different subspaces and contribute unevenly. When merged naively, such mismatches can weaken the directions most critical to certain task losses while overemphasizing relatively less important ones, ultimately reducing the model's ability to represent all tasks faithfully. We revisit this problem through two perspectives: subspace coverage, which captures how broadly LoRA directions cover diverse representational directions, and anisotropy, which reflects the imbalance of influence across those directions. We propose TARA-Merging (Task-Rank Anisotropy Alignment), which aligns merging weights using a preference-weighted cross-entropy pseudo-loss while preserving task-relevant LoRA subspaces. This ensures broad subspace coverage and mitigates anisotropy via direction-wise reweighting. Across eight vision and six NLI benchmarks, TARA-Merging consistently o...
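The abstract does not spell out the TARA-Merging algorithm, but the two ideas it names can be illustrated with a generic sketch: a LoRA update is a low-rank product, naive merging is a weighted sum of such updates, and direction-wise reweighting rescales singular directions of the merged update to reduce anisotropy. This is a minimal illustration of the general concepts only, not the paper's method; the geometric-mean rescaling below is an assumption chosen for the example.

```python
import numpy as np

def lora_update(A, B):
    # A LoRA module parameterizes a weight update as a low-rank product B @ A,
    # with A of shape (r, d_in) and B of shape (d_out, r).
    return B @ A

def naive_merge(updates, weights):
    # Scalar-weighted averaging of LoRA updates. Directions critical to one
    # task can be drowned out when another task's update dominates them.
    return sum(w * U for w, U in zip(weights, updates))

def directionwise_merge(updates, weights):
    # Illustrative direction-wise reweighting (NOT the paper's algorithm):
    # decompose the merged update and pull each singular value toward the
    # geometric mean with the average, shrinking the spread of influence
    # across directions (i.e., reducing anisotropy) while keeping the
    # singular directions, and hence the spanned subspace, intact.
    merged = naive_merge(updates, weights)
    U, s, Vt = np.linalg.svd(merged, full_matrices=False)
    s_balanced = np.sqrt(s * s.mean())
    return (U * s_balanced) @ Vt
```

Because the rescaling is monotone in the singular values, the ordering of directions is preserved; only the imbalance between the strongest and weakest retained directions shrinks.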

Originally published on March 30, 2026. Curated by AI News.

