[2603.20149] Enhancing Hyperspace Analogue to Language (HAL) Representations via Attention-Based Pooling for Text Classification
Computer Science > Computation and Language

arXiv:2603.20149 (cs)

[Submitted on 20 Mar 2026]

Title: Enhancing Hyperspace Analogue to Language (HAL) Representations via Attention-Based Pooling for Text Classification

Authors: Ali Sakour, Zoalfekar Sakour

Abstract: The Hyperspace Analogue to Language (HAL) model relies on global word co-occurrence matrices to construct distributional semantic representations. While these representations capture lexical relationships effectively, aggregating them into sentence-level embeddings via standard mean pooling often results in information loss: mean pooling assigns equal weight to all tokens, so contextually salient words are diluted by uninformative structural tokens. In this paper, we address this limitation by integrating a learnable, temperature-scaled additive attention mechanism into the HAL representation pipeline. To mitigate the sparsity and high dimensionality of the raw co-occurrence matrices, we apply Truncated Singular Value Decomposition (SVD) to project the vectors into a dense latent space prior to the attention layer. We evaluate the proposed architecture on the IMDB sentiment analysis dataset. Empirical results demonstrate that the attention-based pooling approach achieves a test accuracy of 82...
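The pipeline the abstract describes (HAL co-occurrence counts, Truncated SVD, then attention-based pooling instead of mean pooling) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the window size, distance weighting, latent dimension, temperature, and the randomly initialized attention parameters (`W`, `b`, `v`) are all illustrative assumptions; in the paper the attention parameters would be learned during classifier training.

```python
import numpy as np

def hal_cooccurrence(tokens, idx, window=5):
    """Build a HAL-style co-occurrence matrix: each word accumulates
    counts for words appearing after it within a sliding window,
    weighted inversely by distance (closer neighbors count more)."""
    V = len(idx)
    M = np.zeros((V, V))
    for i, w in enumerate(tokens):
        for d in range(1, window + 1):
            if i + d < len(tokens):
                M[idx[w], idx[tokens[i + d]]] += window - d + 1
    return M

def truncated_svd(M, k):
    """Project the sparse, high-dimensional co-occurrence rows into a
    dense k-dimensional latent space via truncated SVD."""
    U, S, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] * S[:k]          # dense k-dim word vectors

def attention_pool(X, W, b, v, temperature=1.0):
    """Temperature-scaled additive attention pooling:
    score_i = v^T tanh(W x_i + b) / T, softmax over tokens,
    then a weighted sum of token vectors (vs. uniform mean pooling)."""
    scores = np.tanh(X @ W + b) @ v / temperature
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ X, a                  # sentence embedding, attention weights

# Toy demo (hypothetical sentence; parameters randomly initialized)
tokens = "the movie was surprisingly good and the acting was great".split()
vocab = sorted(set(tokens))
idx = {w: i for i, w in enumerate(vocab)}

M = hal_cooccurrence(tokens, idx, window=3)
word_vecs = truncated_svd(M, k=4)
X = np.stack([word_vecs[idx[t]] for t in tokens])   # token embeddings

rng = np.random.default_rng(0)
d = X.shape[1]
W, b, v = rng.normal(size=(d, d)), np.zeros(d), rng.normal(size=d)

sentence_vec, weights = attention_pool(X, W, b, v, temperature=0.5)
mean_vec = X.mean(axis=0)           # the mean-pooling baseline
```

A lower temperature sharpens the softmax, concentrating weight on the highest-scoring tokens; temperature 1.0 recovers plain additive attention, and in the limit of a very high temperature the weights flatten toward uniform mean pooling.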