[2603.02874] Retrievit: In-context Retrieval Capabilities of Transformers, State Space Models, and Hybrid Architectures

arXiv - AI 4 min read


Computer Science > Artificial Intelligence — arXiv:2603.02874 (cs) — Submitted on 3 Mar 2026

Title: Retrievit: In-context Retrieval Capabilities of Transformers, State Space Models, and Hybrid Architectures

Authors: Georgios Pantazopoulos, Malvina Nikandrou, Ioannis Konstas, Alessandro Suglia

Abstract: Transformers excel at in-context retrieval but suffer from quadratic complexity with sequence length, while State Space Models (SSMs) offer efficient linear-time processing but have limited retrieval capabilities. We investigate whether hybrid architectures combining Transformers and SSMs can achieve the best of both worlds on two synthetic in-context retrieval tasks. The first task, n-gram retrieval, requires the model to identify and reproduce an n-gram that succeeds the query within the input sequence. The second task, position retrieval, presents the model with a single query token and requires it to perform a two-hop associative lookup: first locating the corresponding element in the sequence, and then outputting its positional index. Under controlled experimental conditions, we assess data efficiency, length generalization, robustness to out of domain training examples, and learned representations across Transformers, SSMs, and hybrid architectures. We find that h...
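To make the two tasks concrete, here is a minimal sketch of how instances of each could be generated. This is an illustrative reconstruction from the abstract's task descriptions, not the paper's actual data pipeline; all function names and parameters (`vocab_size`, `seq_len`, `n`) are our own assumptions.

```python
import random

def ngram_retrieval_example(vocab_size=50, seq_len=20, n=3, seed=0):
    """Toy n-gram retrieval instance: given a sequence and a query token
    that appears in it, the target is the n-gram immediately following
    the query. (Hypothetical sketch, not the paper's implementation.)"""
    rng = random.Random(seed)
    # Sample tokens without replacement so the query occurs exactly once.
    seq = rng.sample(range(vocab_size), seq_len)
    # Pick a query position with at least n tokens after it.
    q_pos = rng.randrange(seq_len - n)
    query = seq[q_pos]
    target = seq[q_pos + 1 : q_pos + 1 + n]
    return seq, query, target

def position_retrieval_example(vocab_size=50, seq_len=20, seed=0):
    """Toy position retrieval instance: given a sequence and a query
    token, the target is the positional index at which the query occurs
    (the two-hop lookup: locate the token, then emit its index)."""
    rng = random.Random(seed)
    seq = rng.sample(range(vocab_size), seq_len)
    q_pos = rng.randrange(seq_len)
    query = seq[q_pos]
    return seq, query, q_pos
```

A Transformer can solve both tasks with a single attention lookup keyed on the query token, whereas an SSM must carry the relevant association through its fixed-size recurrent state, which is what makes these tasks useful probes of in-context retrieval.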

Originally published on March 04, 2026. Curated by AI News.

