[2509.24335] Hyperspherical Latents Improve Continuous-Token Autoregressive Generation


arXiv - Machine Learning


Computer Science > Computer Vision and Pattern Recognition

arXiv:2509.24335 (cs) [Submitted on 29 Sep 2025 (v1), last revised 5 Mar 2026 (this version, v2)]

Title: Hyperspherical Latents Improve Continuous-Token Autoregressive Generation

Authors: Guolin Ke, Hui Xue

Abstract: Autoregressive (AR) models are promising for image generation, yet continuous-token AR variants often trail latent diffusion and masked-generation models. The core issue is heterogeneous variance in VAE latents, which is amplified during AR decoding, especially under classifier-free guidance (CFG), and can cause variance collapse. We propose SphereAR to address this issue. Its core design is to constrain all AR inputs and outputs -- including after CFG -- to lie on a fixed-radius hypersphere (constant $\ell_2$ norm), leveraging hyperspherical VAEs. Our theoretical analysis shows that the hyperspherical constraint removes the scale component (the primary cause of variance collapse), thereby stabilizing AR decoding. Empirically, on ImageNet generation, SphereAR-H (943M) sets a new state of the art for AR models, achieving FID 1.34. Even at smaller scales, SphereAR-L (479M) reaches FID 1.54 and SphereAR-B (208M) reaches 1.92, matching or surpassing much larger baselines such as MAR-H (943M, 1.55) and VAR-d30 (2B, 1.92). To our knowledge, this is t...
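The abstract's central mechanism is easy to picture in code: every latent is renormalized to a fixed $\ell_2$ radius, and because the linear CFG combination generally leaves the sphere, the guided result is re-projected as well. Below is a minimal NumPy sketch of that idea; the function names and signatures are illustrative, not the paper's actual API.

```python
import numpy as np

def project_to_sphere(z: np.ndarray, radius: float = 1.0, eps: float = 1e-8) -> np.ndarray:
    """Project latent vectors onto a fixed-radius hypersphere (constant l2 norm).

    Normalizing away the magnitude removes the scale component of the latent,
    which the paper identifies as the primary driver of variance collapse.
    """
    norm = np.linalg.norm(z, axis=-1, keepdims=True)
    return radius * z / (norm + eps)

def cfg_on_sphere(z_cond: np.ndarray, z_uncond: np.ndarray,
                  scale: float, radius: float = 1.0) -> np.ndarray:
    """Classifier-free guidance followed by re-projection to the sphere.

    The usual CFG extrapolation z_u + s * (z_c - z_u) changes the vector's
    norm, so the guided latent is pushed back onto the fixed-radius sphere
    before being fed to the next AR decoding step.
    """
    z = z_uncond + scale * (z_cond - z_uncond)
    return project_to_sphere(z, radius=radius)
```

For example, projecting a batch of random 16-dimensional latents with `project_to_sphere(z, radius=2.0)` yields vectors whose $\ell_2$ norms are all (numerically) 2.0, regardless of the input magnitudes; the same holds for the output of `cfg_on_sphere` at any guidance scale.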

Originally published on March 06, 2026. Curated by AI News.

