[2512.10411] SWAA: Sliding Window Attention Adaptation for Efficient and Quality Preserving Long Context Processing

arXiv - AI 4 min read

About this article

Computer Science > Computation and Language
arXiv:2512.10411 (cs)
[Submitted on 11 Dec 2025 (v1), last revised 26 Mar 2026 (this version, v5)]

Title: SWAA: Sliding Window Attention Adaptation for Efficient and Quality Preserving Long Context Processing
Authors: Yijiong Yu, Jiale Liu, Qingyun Wu, Huazheng Wang, Ji Pei

Abstract: The quadratic complexity of self-attention in Transformer-based LLMs renders long-context inference prohibitively expensive. While Sliding Window Attention (SWA), the simplest sparse attention pattern, offers a linear-complexity alternative, it suffers from catastrophic long-context performance collapse. This collapse stems from two fundamental factors: the training-inference mismatch that arises when SWA is naively applied to models pretrained with Full Attention (FA), and the inherent structural inability to access distant information when SWA is applied to every module at all times. To address these dual challenges, we propose Sliding Window Attention Adaptation (SWAA), a plug-and-play toolkit of recipes that adapts FA models to SWA without costly pretraining. SWAA systematically combines four core strategies to tackle these distinct issues: (1) Full Attention (FA) Decode and (2) interleaving FA and SWA layers, which mitigate structural defects by selectively allowing access to d...
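To make the abstract's two key ideas concrete, here is a minimal sketch in NumPy of (a) a causal sliding-window attention mask, which gives the linear cost profile the paper describes, and (b) an interleaving schedule that keeps full attention in some layers. The function names, the `fa_every` parameter, and the specific schedule are illustrative assumptions, not the paper's actual recipe.

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask where position i may attend only to positions j
    with i - window < j <= i (causal, fixed-width sliding window).
    Each row has at most `window` True entries, so attention cost
    grows linearly with seq_len instead of quadratically."""
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (j > i - window)

def interleaved_masks(num_layers: int, seq_len: int, window: int,
                      fa_every: int) -> list:
    """Hypothetical interleaving schedule (an assumption, not the
    paper's exact strategy): every `fa_every`-th layer keeps full
    causal attention so distant tokens stay reachable; the remaining
    layers use the cheaper sliding-window mask."""
    full = np.tril(np.ones((seq_len, seq_len), dtype=bool))  # causal FA
    swa = sliding_window_mask(seq_len, window)
    return [full if layer % fa_every == 0 else swa
            for layer in range(num_layers)]
```

For example, with `seq_len=8` and `window=3`, the last row of `sliding_window_mask` attends to exactly 3 positions, while the full causal mask's last row attends to all 8, illustrating the per-token cost gap that widens with context length.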

Originally published on March 27, 2026. Curated by AI News.

Related Articles

I Asked ChatGPT 500 Questions. Here Are the Ads I Saw Most Often | WIRED

Ads are rolling out across the US on ChatGPT’s free tier. I asked OpenAI's bot 500 questions to see what these ads were like and how they...

Wired - AI · 9 min ·

Abacus.Ai Claw LLM consumes an incredible amount of credit without any usage :(

Three days ago, I clicked the "Deploy OpenClaw In Seconds" button to get an overview of the new service, but I didn't build any automatio...

Reddit - Artificial Intelligence · 1 min ·
Google’s Gemini AI app debuts in Hong Kong

Tech giant’s chatbot service tops Apple’s app store chart in the city.

AI Tools & Products · 2 min ·
Google Launches Gemini Import Tools to Poach Users From Rival AI Apps

Anyone looking to switch their AI assistant will find it surprisingly easy, as it only takes a few steps to move from A to B. This is not...

AI Tools & Products · 4 min ·

