[2603.21042] Statistical Learning for Latent Embedding Alignment with Application to Brain Encoding and Decoding


arXiv - Machine Learning


Statistics > Methodology
arXiv:2603.21042 (stat) [Submitted on 22 Mar 2026]

Title: Statistical Learning for Latent Embedding Alignment with Application to Brain Encoding and Decoding
Authors: Shuoxun Xu, Zhanhao Yan, Lexin Li

Abstract: Brain encoding and decoding aims to understand the relationship between external stimuli and brain activities, and is a fundamental problem in neuroscience. In this article, we study latent embedding alignment for brain encoding and decoding, with a focus on improving sample efficiency under limited fMRI-stimulus paired data and substantial subject heterogeneity. We propose a lightweight alignment framework equipped with two statistical learning components: inverse semi-supervised learning that leverages abundant unpaired stimulus embeddings through inverse mapping and residual debiasing, and meta transfer learning that borrows strength from pretrained models across subjects via sparse aggregation and residual correction. Both methods operate exclusively at the alignment stage while keeping encoders and decoders frozen, allowing for efficient computation, modular deployment, and rigorous theoretical analysis. We establish finite-sample generalization bounds and safety guarantees, and demonstrate competitive empirical performance on the large-scale fMRI-...
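The paper's own method is not reproduced here, but the core idea of a lightweight alignment stage between frozen encoders and decoders can be sketched as a ridge-regularized linear map fit only on paired data. Everything below (function names, dimensions, the choice of ridge regression) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def fit_alignment(brain_feats, stim_embeds, lam=1.0):
    """Fit a ridge-regularized linear map W so that brain_feats @ W
    approximates stim_embeds. Encoders producing both representations
    stay frozen; only W is learned (a sketch, not the paper's method)."""
    d = brain_feats.shape[1]
    gram = brain_feats.T @ brain_feats + lam * np.eye(d)
    return np.linalg.solve(gram, brain_feats.T @ stim_embeds)

# Toy paired data standing in for frozen fMRI-encoder outputs (X)
# and frozen stimulus embeddings (Z).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
W_true = rng.normal(size=(16, 8))
Z = X @ W_true + 0.01 * rng.normal(size=(200, 8))

W = fit_alignment(X, Z, lam=0.1)
mse = float(np.mean((X @ W - Z) ** 2))
```

Because only the small alignment map is trained, such a stage is cheap to fit per subject, which is consistent with the abstract's emphasis on efficient computation and modular deployment.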

Originally published on March 24, 2026. Curated by AI News.

