[2603.03355] Inhibitory Cross-Talk Enables Functional Lateralization in Attention-Coupled Latent Memory

Quantitative Biology > Neurons and Cognition
arXiv:2603.03355 (q-bio) [Submitted on 27 Feb 2026]

Title: Inhibitory Cross-Talk Enables Functional Lateralization in Attention-Coupled Latent Memory
Authors: Hong Jeong

Abstract: We present a memory-augmented transformer in which attention serves simultaneously as a retrieval, consolidation, and write-back operator. The core update, $A^\top A V W$, re-grounds retrieved values into persistent memory slots via the Gram matrix $A^\top A$, providing a principled tripartite projection: observation space $\to$ latent memory $\to$ supervised transformation. We partition the memory into lateralized left and right banks coupled through a sign-controlled cross-talk matrix $W_s$, and show that the sign of this coupling is decisive for specialization. Excitatory cross-talk ($s=+1$) causes bank-dominance collapse: one bank monopolises all inputs and $\mathcal{P}_{ct} \to 0.5$, despite lowering task loss. Inhibitory cross-talk ($s=-1$), motivated by the net inhibitory effect of callosal projections in human cortex, actively suppresses contralateral bank activation and achieves saturated specialization ($\mathcal{D}_{sep} = \pm 1.00$, $\mathcal{P}_{ct} \approx 0$). On a controlled symbolic benchmark combining an episodic bijection cipher (requiring associative recall) with a strict ...
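To make the abstract's update concrete, here is a minimal numpy sketch of one step of the mechanism it describes: attention over memory slots performs retrieval ($AV$), the Gram-matrix term $A^\top A V W$ writes the retrieved values back into persistent memory, and the two lateralized banks are coupled through a sign-controlled cross-talk matrix $W_s$. All shapes, the function name, and the simple choice $W_s = s \cdot c \cdot I$ are illustrative assumptions, not details confirmed by the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_write_back(X, M, W, s=-1.0, c=0.1):
    """One attention step over latent memory with write-back and cross-talk.

    X : (n, d) observations; M : (m, d) memory slots; W : (d, d) learned map.
    Hypothetical sketch: shapes and the diagonal cross-talk W_s = s*c*I are
    assumptions made for illustration.
    """
    # Attention of observations over memory slots (retrieval operator A).
    scores = X @ M.T / np.sqrt(X.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)        # (n, m), row-stochastic
    V = M                                    # values read from memory
    retrieved = A @ V                        # (n, d) retrieval output

    # Consolidation / write-back: the Gram matrix A^T A re-grounds the
    # retrieved values into the persistent slots (the A^T A V W update).
    M_new = M + (A.T @ A) @ V @ W

    # Lateralized banks: split slots in half and couple them through the
    # sign-controlled cross-talk W_s. s = -1 (inhibitory) suppresses the
    # contralateral bank; s = +1 (excitatory) reinforces it.
    half = M.shape[0] // 2
    L, R = M_new[:half], M_new[half:]
    W_s = s * c * np.eye(M.shape[1])
    M_new = np.vstack([L + R @ W_s, R + L @ W_s])
    return retrieved, M_new

n, m, d = 4, 6, 8
X = rng.normal(size=(n, d))
M = rng.normal(size=(m, d))
W = 0.01 * rng.normal(size=(d, d))
out, M1 = attention_write_back(X, M, W, s=-1.0)   # inhibitory coupling
```

Flipping `s` to `+1.0` switches the coupling to excitatory, the regime the abstract reports as collapsing into bank dominance; the paper's actual experiments involve training, which this single forward step does not attempt to reproduce.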

Originally published on March 05, 2026. Curated by AI News.
