[2604.00199] QUEST: A robust attention formulation using query-modulated spherical attention

arXiv - AI · 4 min read

About this article

Computer Science > Machine Learning
arXiv:2604.00199 (cs)
[Submitted on 31 Mar 2026]

Title: QUEST: A robust attention formulation using query-modulated spherical attention
Authors: Hariprasath Govindarajan, Per Sidén, Jacob Roll, Fredrik Lindsten

Abstract: The Transformer model architecture has become one of the most widely used in deep learning, and the attention mechanism is at its core. The standard attention formulation applies a softmax operation to a scaled dot product between query and key vectors. We explore the role played by the norms of the queries and keys, which can cause training instabilities when they grow arbitrarily. We demonstrate how this can happen even in simple Transformer models, in the presence of easy-to-learn spurious patterns in the data. We propose a new attention formulation, QUEry-modulated Spherical aTtention (QUEST), that constrains the keys to a hyperspherical latent space while still allowing individual tokens to flexibly control the sharpness of the attention distribution. QUEST can be used as a drop-in replacement for standard attention. We focus on vision applications while also exploring other domains to highlight the method's generality. We show that (1) QUEST trains without instabilities and (2) produces models with improved performance (...
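The abstract only sketches the idea: keys are constrained to the unit hypersphere (removing key-norm growth as a source of instability), while each token retains control over how sharp its attention distribution is. The paper's exact formulation is not given here, so the sketch below is illustrative only — in particular, using the raw query norm as the per-query sharpness control is an assumption, not the authors' definition.

```python
import numpy as np

def quest_like_attention(Q, K, V, eps=1e-6):
    """Illustrative sketch of query-modulated spherical attention.

    Keys are L2-normalized onto the unit hypersphere; the query's norm
    is kept as a per-token temperature that sharpens or flattens the
    softmax. This is a guess at the mechanism described in the
    abstract, not the paper's actual formulation.
    """
    # Constrain each key vector to unit L2 norm (the hyperspherical latent space).
    K_sphere = K / (np.linalg.norm(K, axis=-1, keepdims=True) + eps)
    # Split each query into a direction and a norm; the norm acts as
    # a per-query sharpness (assumed here, not stated in the abstract).
    q_norm = np.linalg.norm(Q, axis=-1, keepdims=True) + eps
    Q_dir = Q / q_norm
    # Cosine-similarity logits, scaled by the query-controlled sharpness.
    logits = q_norm * (Q_dir @ K_sphere.T)
    # Numerically stabilized softmax over keys.
    logits -= logits.max(axis=-1, keepdims=True)
    weights = np.exp(logits)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

Because the logits are bounded cosine similarities times a learnable-in-practice sharpness, arbitrarily growing key norms can no longer saturate the softmax — which is the instability the abstract attributes to standard scaled dot-product attention.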

Originally published on April 02, 2026. Curated by AI News.

Related Articles

I can't help rooting for tiny open source AI model maker Arcee | TechCrunch
Llms

Arcee is a tiny 26-person U.S. startup that built a high-performing, massive, open source LLM. And it's gaining popularity with OpenClaw ...

TechCrunch - AI · 4 min ·
Machine Learning

We have an AI agent fragmentation problem

Every AI agent works fine on its own — but the moment you try to use more than one, everything falls apart. Different runtimes. Different...

Reddit - Artificial Intelligence · 1 min ·
Machine Learning

Using AI properly

AI is a tool. Period. I spent decades asking forums for help in writing HTML code for my website. I wanted my posts to self-scroll to a p...

Reddit - Artificial Intelligence · 1 min ·
Anthropic Teams Up With Its Rivals to Keep AI From Hacking Everything | WIRED
Llms

The AI lab's Project Glasswing will bring together Apple, Google, and more than 45 other organizations. They'll use the new Claude Mythos...

Wired - AI · 7 min ·