[2602.17287] Representation Collapse in Machine Translation Through the Lens of Angular Dispersion
Summary
This paper examines representation collapse in neural machine translation (NMT) models built on the Transformer architecture, and shows that angular dispersion regularization both mitigates the collapse and improves translation quality.
Why It Matters
Understanding representation collapse is crucial for improving the performance of machine translation systems. This research highlights a common issue in neural models and presents a solution that enhances translation quality, making it relevant for developers and researchers in NLP and AI.
Key Takeaways
- Representation collapse can significantly affect translation quality in neural machine translation models.
- Angular dispersion regularization helps mitigate representation collapse, leading to improved performance.
- The study shows that quantized models also exhibit representation collapse, so regularization remains useful even in quantized settings.
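To make "representation collapse" concrete: it can be quantified as the angular dispersion of a set of representation vectors, i.e., how widely they spread on the unit hypersphere. The paper's exact metric is not reproduced here; the sketch below (function name and setup are hypothetical) uses the mean pairwise angle between vectors, which approaches zero when all representations point the same way.

```python
import numpy as np

def angular_dispersion(X: np.ndarray) -> float:
    """Mean pairwise angle (radians) between the row vectors of X.

    Collapsed representations (all rows pointing in the same direction)
    give a dispersion near 0; well-spread vectors give larger values.
    Illustrative helper, not the paper's exact formulation.
    """
    # Normalize rows to unit length so only direction matters.
    U = X / np.linalg.norm(X, axis=1, keepdims=True)
    # Pairwise cosine similarities, clipped for numerical safety.
    cos = np.clip(U @ U.T, -1.0, 1.0)
    # Average the angle over distinct pairs (i < j).
    i, j = np.triu_indices(len(X), k=1)
    return float(np.arccos(cos[i, j]).mean())

# Collapsed batch: every row is the same vector -> dispersion ~ 0.
collapsed = np.tile(np.array([1.0, 2.0, 3.0, 4.0]), (8, 1))
# Spread batch: random directions -> clearly nonzero dispersion.
rng = np.random.default_rng(0)
spread = rng.standard_normal((8, 4))
print(angular_dispersion(collapsed))  # ~0.0
print(angular_dispersion(spread))
```

Tracking this scalar per layer over the course of training is one way to observe the layer-wise collapse dynamics the paper describes.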
Computer Science > Computation and Language
arXiv:2602.17287 (cs) — Submitted on 19 Feb 2026
Title: Representation Collapse in Machine Translation Through the Lens of Angular Dispersion
Authors: Evgeniia Tokarchuk, Maya K. Nachesa, Sergey Troshin, Vlad Niculae
Abstract: Modern neural translation models based on the Transformer architecture are known for their high performance, particularly when trained on high-resource datasets. A standard next-token prediction training strategy, while widely adopted in practice, may lead to overlooked artifacts such as representation collapse. Previous works have shown that this problem is especially pronounced in the representation of the deeper Transformer layers, where it often fails to efficiently utilize the geometric space. Representation collapse is even more evident in end-to-end training of continuous-output neural machine translation, where the trivial solution would be to set all vectors to the same value. In this work, we analyze the dynamics of representation collapse at different levels of discrete and continuous NMT transformers throughout training. We incorporate an existing regularization method based on angular dispersion and demonstrate empirically that it not only mitigates collapse but also improves translation quality. Furthermore, we show that quantized...
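The abstract mentions incorporating "an existing regularization method based on angular dispersion" into training. The paper's exact loss term is not given here; one common way to encourage dispersion is to penalize the mean pairwise cosine similarity of a batch of hidden states, pushing vectors apart on the hypersphere. A minimal numpy sketch under that assumption (function name and the weight `lam` are hypothetical):

```python
import numpy as np

def dispersion_penalty(H: np.ndarray) -> float:
    """Mean pairwise cosine similarity of the row vectors of H.

    Adding lam * dispersion_penalty(hidden_states) to the training loss
    discourages collapse: identical directions score 1.0 (maximal
    penalty), mutually orthogonal directions score 0. A sketch of an
    angular-dispersion-style regularizer, not the paper's exact method.
    """
    U = H / np.linalg.norm(H, axis=1, keepdims=True)
    cos = U @ U.T
    n = len(H)
    # Average over off-diagonal pairs only (the diagonal is always 1).
    return float((cos.sum() - n) / (n * (n - 1)))

collapsed = np.ones((4, 3))        # fully collapsed batch of hidden states
print(dispersion_penalty(collapsed))   # ~1.0 -> maximal penalty
print(dispersion_penalty(np.eye(3)))   # ~0.0 -> orthogonal, no penalty

lam = 0.1  # hypothetical regularization weight
# total_loss = nll_loss + lam * dispersion_penalty(hidden_states)
```

In an actual training loop this penalty would be computed with a differentiable framework (e.g. PyTorch) so gradients flow back into the encoder and decoder representations.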