[2602.24264] Compositional Generalization Requires Linear, Orthogonal Representations in Vision Embedding Models

arXiv - Machine Learning · 4 min read

About this article

Abstract page for arXiv paper 2602.24264: Compositional Generalization Requires Linear, Orthogonal Representations in Vision Embedding Models

Computer Science > Computer Vision and Pattern Recognition

arXiv:2602.24264 (cs) · Submitted on 27 Feb 2026

Title: Compositional Generalization Requires Linear, Orthogonal Representations in Vision Embedding Models

Authors: Arnas Uselis, Andrea Dittadi, Seong Joon Oh

Abstract: Compositional generalization, the ability to recognize familiar parts in novel contexts, is a defining property of intelligent systems. Although modern models are trained on massive datasets, they still cover only a tiny fraction of the combinatorial space of possible inputs, raising the question of what structure representations must have to support generalization to unseen combinations. We formalize three desiderata for compositional generalization under standard training (divisibility, transferability, stability) and show they impose necessary geometric constraints: representations must decompose linearly into per-concept components, and these components must be orthogonal across concepts. This provides theoretical grounding for the Linear Representation Hypothesis: the linear structure widely observed in neural representations is a necessary consequence of compositional generalization. We further derive dimension bounds linking the number of composable concepts to the embedding geometry. Empirically, we evaluate these predict...
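The abstract itself gives no formulas or code, but the two geometric conditions it names (a linear, per-concept decomposition of embeddings and orthogonality between concepts) can be illustrated with a small numerical sketch. The snippet below is purely illustrative: the synthetic concept vectors, the toy "shape"/"color" factors, and the helper names `composed_embedding` and `orthogonality_score` are assumptions for demonstration, not anything from the paper's method or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: two concept factors ("shape" and "color"), each with its own
# concept vectors living in a shared embedding space.
dim = 16
shape_vecs = {"circle": rng.normal(size=dim), "square": rng.normal(size=dim)}
color_vecs = {"red": rng.normal(size=dim), "blue": rng.normal(size=dim)}

def composed_embedding(shape: str, color: str) -> np.ndarray:
    """Linear (additive) composition: the embedding of a (shape, color)
    pair is the sum of its per-concept components."""
    return shape_vecs[shape] + color_vecs[color]

def orthogonality_score(u: np.ndarray, v: np.ndarray) -> float:
    """Absolute cosine similarity; values near 0 mean (near-)orthogonality."""
    return float(abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Divisibility/transferability in miniature: an unseen combination
# ("square", "blue") is recoverable from components seen in other contexts.
unseen = composed_embedding("square", "blue")
recovered_shape = unseen - color_vecs["blue"]
print(np.allclose(recovered_shape, shape_vecs["square"]))  # True by construction

# Cross-concept orthogonality: random vectors are only approximately
# orthogonal; the paper argues exact orthogonality is what generalization requires.
for s, sv in shape_vecs.items():
    for c, cv in color_vecs.items():
        print(s, c, round(orthogonality_score(sv, cv), 3))
```

In this toy construction the additive decomposition holds by design; the paper's contribution, per the abstract, is the converse direction: showing that any representation supporting compositional generalization under standard training must have this linear, orthogonal structure.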

Originally published on March 02, 2026. Curated by AI News.

