[2602.17287] Representation Collapse in Machine Translation Through the Lens of Angular Dispersion

arXiv - Machine Learning

Summary

This paper explores representation collapse in neural machine translation models, particularly focusing on the Transformer architecture and the impact of angular dispersion regularization on translation quality.

Why It Matters

Understanding representation collapse is crucial for improving the performance of machine translation systems. This research highlights a common issue in neural models and presents a solution that enhances translation quality, making it relevant for developers and researchers in NLP and AI.

Key Takeaways

  • Representation collapse can significantly affect translation quality in neural machine translation models.
  • Angular dispersion regularization helps mitigate representation collapse, leading to improved performance.
  • The study shows that quantized models also exhibit representation collapse, indicating that regularization remains useful even under quantization.

Computer Science > Computation and Language

arXiv:2602.17287 (cs) · Submitted on 19 Feb 2026

Title: Representation Collapse in Machine Translation Through the Lens of Angular Dispersion

Authors: Evgeniia Tokarchuk, Maya K. Nachesa, Sergey Troshin, Vlad Niculae

Abstract: Modern neural translation models based on the Transformer architecture are known for their high performance, particularly when trained on high-resource datasets. A standard next-token prediction training strategy, while widely adopted in practice, may lead to overlooked artifacts such as representation collapse. Previous works have shown that this problem is especially pronounced in the representations of the deeper Transformer layers, which often fail to efficiently utilize the geometric space. Representation collapse is even more evident in end-to-end training of continuous-output neural machine translation, where the trivial solution would be to set all vectors to the same value. In this work, we analyze the dynamics of representation collapse at different levels of discrete and continuous NMT transformers throughout training. We incorporate an existing regularization method based on angular dispersion and demonstrate empirically that it not only mitigates collapse but also improves translation quality. Furthermore, we show that quantized...
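The summary does not spell out the paper's exact regularizer, but the core idea — penalizing hidden states that bunch up in the same direction on the unit sphere — can be sketched with a simple, hypothetical stand-in: the mean pairwise cosine similarity of normalized representations. Minimizing this term during training pushes vectors toward greater angular dispersion; the function name and the toy data below are illustrative, not from the paper.

```python
import numpy as np

def angular_dispersion_penalty(h: np.ndarray) -> float:
    """Mean pairwise cosine similarity of the rows of h (lower = more dispersed).

    h: (n, d) matrix of hidden states. Each row is projected onto the unit
    sphere, so only angular structure matters; a fully collapsed batch (all
    rows pointing the same way) yields a penalty near 1.
    """
    u = h / np.linalg.norm(h, axis=1, keepdims=True)  # unit-normalize each row
    cos = u @ u.T                                     # pairwise cosine similarities
    n = len(u)
    off_diag = cos[~np.eye(n, dtype=bool)]            # drop self-similarities (all 1)
    return float(off_diag.mean())

rng = np.random.default_rng(0)
# Collapsed case: all vectors are tiny perturbations of the same direction.
collapsed = np.ones((8, 16)) + 1e-3 * rng.standard_normal((8, 16))
# Dispersed case: independent random directions in 16-d space.
dispersed = rng.standard_normal((8, 16))

print(angular_dispersion_penalty(collapsed))  # near 1: severe collapse
print(angular_dispersion_penalty(dispersed))  # near 0: well dispersed
```

In an actual training loop this penalty (or a dispersion-maximizing variant of it) would be added to the translation loss with a small weight, which is the general shape of the regularization the abstract describes.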
