[2603.03830] Large-Margin Hyperdimensional Computing: A Learning-Theoretical Perspective
Computer Science > Machine Learning

arXiv:2603.03830 (cs) [Submitted on 4 Mar 2026]

Title: Large-Margin Hyperdimensional Computing: A Learning-Theoretical Perspective
Authors: Nikita Zeulin, Olga Galinina, Ravikumar Balakrishnan, Nageen Himayat, Sergey Andreev

Abstract: Overparameterized machine learning (ML) methods such as neural networks may be prohibitively resource-intensive for devices with limited computational capabilities. Hyperdimensional computing (HDC) is an emerging resource-efficient, low-complexity ML method that allows hardware-efficient implementations of (re-)training and inference procedures. In this paper, we propose a maximum-margin HDC classifier, which significantly outperforms baseline HDC methods on several benchmark datasets. Our method leverages a formal relation between HDC and support vector machines (SVMs) that we establish for the first time. Our findings may inspire novel HDC methods with more hardware-oriented implementations than SVMs, thus enabling more efficient learning solutions for various intelligent resource-constrained applications.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2603.03830 [cs.LG] (or arXiv:2603.03830v1 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2603.03830
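For readers unfamiliar with HDC, the abstract's mention of hardware-efficient (re-)training and inference can be illustrated with a minimal prototype-based HDC classifier: encode inputs into high-dimensional bipolar hypervectors via a random projection, bundle (sum) each class's hypervectors into a class prototype, and classify by cosine similarity. This is a generic background sketch on made-up toy data, not the paper's maximum-margin method; all names, dimensions, and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1000           # hypervector dimensionality (assumed for illustration)
n_features = 10    # input feature dimensionality (toy choice)

# Random bipolar projection matrix used to encode inputs into hypervectors.
proj = rng.choice([-1.0, 1.0], size=(D, n_features))

def encode(x):
    """Map a feature vector to a bipolar {-1, +1} hypervector."""
    return np.sign(proj @ x)

# Toy two-class data: two well-separated Gaussian blobs (made up here).
X0 = rng.normal(-1.0, 0.5, size=(50, n_features))
X1 = rng.normal(+1.0, 0.5, size=(50, n_features))

# "Training": bundle (element-wise sum) the encoded samples of each class
# into a single class-prototype hypervector.
protos = np.array([sum(encode(x) for x in X) for X in (X0, X1)])

def classify(x):
    """Predict the class whose prototype is most cosine-similar to encode(x)."""
    hv = encode(x)
    sims = protos @ hv / (np.linalg.norm(protos, axis=1) * np.linalg.norm(hv))
    return int(np.argmax(sims))

acc = np.mean([classify(x) == 0 for x in X0] + [classify(x) == 1 for x in X1])
print(f"training accuracy: {acc:.2f}")
```

Because training reduces to sign flips and integer additions, this scheme maps naturally onto low-power hardware, which is the efficiency argument the abstract contrasts against overparameterized neural networks and conventional SVM solvers.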