[2603.16513] FEAT: A Linear-Complexity Foundation Model for Extremely Large Structured Data
Computer Science > Machine Learning
arXiv:2603.16513 (cs)
[Submitted on 17 Mar 2026 (v1), last revised 20 Mar 2026 (this version, v2)]

Title: FEAT: A Linear-Complexity Foundation Model for Extremely Large Structured Data
Authors: Zhenghang Song, Tang Qian, Lu Chen, Yushuai Li, Zhengke Hu, Bingbing Fang, Yumeng Song, Junbo Zhao, Sheng Zhang, Tianyi Li

Abstract: Structured data is foundational to healthcare, finance, e-commerce, and scientific data management. Large structured-data models (LDMs) extend the foundation model paradigm to unify heterogeneous datasets for tasks such as classification, regression, and decision support. However, existing LDMs face major limitations. First, most rely on sample-wise self-attention, whose O(N^2) complexity limits the sample count. Second, linear sequence models often degrade representations due to hidden-state compression and artificial causal bias. Third, synthetic-only pre-training often fails to match real-world distributions. We propose FEAT, a linear-complexity foundation model for extremely large structured data. FEAT introduces a multi-layer dual-axis architecture that replaces quadratic attention with hybrid linear encoding. The architecture combines adaptive-fusion bi-Mamba-2 (AFBM) for local sample dependencies and convolutional gated linear attention (...
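To make the complexity claim concrete, here is a minimal sketch (not FEAT's actual architecture, which the truncated abstract only names) contrasting standard softmax attention, which materializes an N x N score matrix over N samples, with a kernelized linear-attention recurrence that maintains a fixed-size summary state and so costs O(N). The `elu+1` feature map is a common choice from the linear-attention literature and is an assumption here, not something the abstract specifies.

```python
import numpy as np

def softmax_attention(Q, K, V):
    """Standard attention: builds an (N, N) score matrix -> O(N^2) in samples."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])               # (N, N)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                    # (N, d_v)

def linear_attention(Q, K, V):
    """Kernelized linear attention: a (d, d_v) summary state and a (d,)
    normalizer replace the (N, N) matrix -> O(N) in samples."""
    phi = lambda X: np.where(X > 0, X + 1.0, np.exp(X))   # elu(x) + 1 > 0
    Qf, Kf = phi(Q), phi(K)
    S = Kf.T @ V                                          # (d, d_v) state
    z = Kf.sum(axis=0)                                    # (d,) normalizer
    return (Qf @ S) / (Qf @ z)[:, None]                   # (N, d_v)

rng = np.random.default_rng(0)
N, d = 6, 4
Q, K, V = rng.normal(size=(3, N, d))
print(softmax_attention(Q, K, V).shape)   # one (d,) output per sample
print(linear_attention(Q, K, V).shape)    # same shape, no N x N matrix
```

Both functions return one output vector per sample; the linear variant never allocates memory quadratic in N, which is the property that lets sample count scale.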