[2604.05414] Training Without Orthogonalization, Inference With SVD: A Gradient Analysis of Rotation Representations

arXiv - Machine Learning

About this article

Computer Science > Machine Learning
arXiv:2604.05414 (cs)
[Submitted on 7 Apr 2026]

Title: Training Without Orthogonalization, Inference With SVD: A Gradient Analysis of Rotation Representations

Authors: Chris Choy

Abstract: Recent work has shown that removing orthogonalization during training and applying it only at inference improves rotation estimation in deep learning, with empirical evidence favoring 9D representations with SVD projection. However, the theoretical understanding of why SVD orthogonalization specifically harms training, and why it should be preferred over Gram-Schmidt at inference, remains incomplete. We provide a detailed gradient analysis of SVD orthogonalization specialized to $3 \times 3$ matrices and $SO(3)$ projection. Our central result derives the exact spectrum of the SVD backward-pass Jacobian: it has rank $3$ (matching the dimension of $SO(3)$), with nonzero singular values $2/(s_i + s_j)$ and condition number $\kappa = (s_1 + s_2)/(s_2 + s_3)$, creating quantifiable gradient distortion that is most severe when the predicted matrix is far from $SO(3)$ (e.g., early in training, when $s_3 \approx 0$). We further show that even stabilized SVD gradients introduce gradient direction error, whereas removing SVD from the training loop avoids this tradeoff entirely. We also prove...
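The claimed spectrum is easy to probe numerically. Below is a minimal sketch (not the paper's code) assuming the standard SVD-based $SO(3)$ projection $R = U \operatorname{diag}(1, 1, \det(UV^\top)) V^\top$ for $M = U \operatorname{diag}(s) V^\top$; it uses central finite differences to check that the Jacobian of this projection has rank $3$, nonzero singular values $2/(s_i + s_j)$, and condition number $(s_1 + s_2)/(s_2 + s_3)$:

```python
import numpy as np

def proj_so3(M):
    """Project a 3x3 matrix onto SO(3) via SVD (standard construction)."""
    U, s, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # sign fix enforces det = +1
    return U @ D @ Vt

def numerical_jacobian(f, M, eps=1e-6):
    """Central-difference Jacobian of vec(f(M)) w.r.t. vec(M), shape (9, 9)."""
    J = np.zeros((9, 9))
    for k in range(9):
        E = np.zeros(9)
        E[k] = eps
        E = E.reshape(3, 3)
        J[:, k] = (f(M + E) - f(M - E)).ravel() / (2.0 * eps)
    return J

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
if np.linalg.det(M) < 0:
    M[0] *= -1.0  # negating one row flips the sign of det(M)

s = np.linalg.svd(M, compute_uv=False)  # singular values, s[0] >= s[1] >= s[2]
J = numerical_jacobian(proj_so3, M)
jac_sv = np.linalg.svd(J, compute_uv=False)

# Expect three nonzero values 2/(s_i + s_j) and six values near zero.
predicted = sorted((2.0 / (s[i] + s[j]) for i in range(3) for j in range(i + 1, 3)),
                   reverse=True)
print("Jacobian singular values:", np.round(jac_sv, 6))
print("predicted 2/(s_i + s_j) :", np.round(predicted, 6))
print("condition number kappa  :", (s[0] + s[1]) / (s[1] + s[2]))
```

For a generic matrix with positive determinant, six of the nine Jacobian singular values come out numerically zero (the projection is insensitive to "symmetric" perturbations), and the three surviving gains $2/(s_i + s_j)$ become badly scaled as the smaller singular values shrink, which is exactly the distortion the condition number $\kappa$ quantifies.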

Originally published on April 08, 2026. Curated by AI News.

