[2602.04728] Scalable Cross-Attention Transformer for Cooperative Multi-AP OFDM Uplink Reception
Electrical Engineering and Systems Science > Signal Processing
arXiv:2602.04728 (eess)
[Submitted on 4 Feb 2026 (v1), last revised 7 Apr 2026 (this version, v2)]

Title: Scalable Cross-Attention Transformer for Cooperative Multi-AP OFDM Uplink Reception
Authors: Xavier Tardy, Grégoire Lefebvre, Apostolos Kountouris, Haïfa Fares, Amor Nafkha

Abstract: We propose a cross-attention Transformer for joint decoding of uplink OFDM signals received by multiple coordinated access points. A shared per-receiver encoder learns the time-frequency structure of each grid, and a token-wise cross-attention module fuses the receivers to produce soft log-likelihood ratios for a standard channel decoder without explicit channel estimates. Trained with a bit-metric objective, the model adapts its fusion to per-receiver reliability and remains robust under degraded links, strong frequency selectivity, and sparse pilots. Over realistic Wi-Fi channels, it outperforms classical pipelines and strong neural baselines, often matching or surpassing a local perfect-CSI reference while remaining compact and computationally efficient on commodity hardware, making it suitable for next-generation coordinated Wi-Fi receivers.

Subjects: Signal Processing (eess.SP); Information Theory (cs.IT); Machine Learning (cs.LG)
MSC cla...
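The abstract's fusion step can be illustrated with a minimal NumPy sketch: a shared linear "encoder" maps each receiver's time-frequency grid to tokens, then, for each token position, attention weights over the receiver axis combine the receivers before an output head emits soft LLRs. This is an assumption-laden toy, not the authors' model: the function names, the mean-over-receivers query, single-head attention, and linear (rather than Transformer) layers are all simplifications for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_receivers(grids, W_enc, W_q, W_k, W_v, W_llr):
    """Token-wise cross-attention fusion of R receiver grids (toy sketch).

    grids: (R, T, F, 2) real/imag OFDM resource grids from R access points.
    Returns (llrs, weights): per-token soft LLRs (N, bits) and the per-token
    attention weights over receivers (N, R), where N = T * F.
    """
    R, T, F, _ = grids.shape
    tokens = grids.reshape(R, T * F, 2) @ W_enc           # shared encoder: (R, N, d)
    # Fusion query per token: mean over receivers (an assumption of this sketch).
    q = tokens.mean(axis=0) @ W_q                         # (N, d)
    k = tokens @ W_k                                      # (R, N, d)
    v = tokens @ W_v                                      # (R, N, d)
    # Scaled dot-product attention, computed per token across the receiver axis.
    scores = np.einsum('nd,rnd->nr', q, k) / np.sqrt(q.shape[-1])  # (N, R)
    w = softmax(scores, axis=-1)                          # per-token receiver weights
    fused = np.einsum('nr,rnd->nd', w, v)                 # (N, d)
    return fused @ W_llr, w                               # soft LLRs per token

# Usage with random weights, just to show the shapes involved.
rng = np.random.default_rng(0)
R, T, F, d, bits = 3, 2, 4, 8, 2
grids = rng.standard_normal((R, T, F, 2))
W_enc = rng.standard_normal((2, d))
W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))
W_llr = rng.standard_normal((d, bits))
llrs, w = fuse_receivers(grids, W_enc, W_q, W_k, W_v, W_llr)
```

In the paper the attention weights play the role of a learned, per-token reliability measure over receivers, which is what lets the fusion down-weight a degraded link without any explicit channel estimate; in this sketch that behavior is only present in structure, since the weights are untrained.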