[1602.05350] Relative Error Embeddings for the Gaussian Kernel Distance
Computer Science > Machine Learning
arXiv:1602.05350 (cs)
[Submitted on 17 Feb 2016 (v1), last revised 22 Mar 2026 (this version, v3)]

Title: Relative Error Embeddings for the Gaussian Kernel Distance
Authors: Di Chen, Jeff M. Phillips

Abstract: A reproducing kernel can define an embedding of a data point into an infinite-dimensional reproducing kernel Hilbert space (RKHS). The norm in this space describes a distance, which we call the kernel distance. The random Fourier features (of Rahimi and Recht) describe an oblivious approximate mapping into finite-dimensional Euclidean space that behaves similarly to the RKHS. We show in this paper that, for the Gaussian kernel, the Euclidean norm between these mapped features has $(1+\varepsilon)$-relative error with respect to the kernel distance. When there are $n$ data points, we show that $O((1/\varepsilon^2) \log(n))$ dimensions of the approximate feature space are sufficient and necessary. Without a bound on $n$, but when the original points lie in $\mathbb{R}^d$ and have diameter bounded by $\mathcal{M}$, we show that $O((d/\varepsilon^2) \log(\mathcal{M}))$ dimensions are sufficient, and that this many are required, up to $\log(1/\varepsilon)$ factors.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:1602.05350 [cs.LG] (or arXiv:1602.05350v3 [cs.LG] for this version)
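As a concrete illustration of the mapping the abstract describes (a minimal sketch, not the authors' code), the Rahimi–Recht random Fourier features for the Gaussian kernel draw frequencies from a Gaussian and use paired cosine/sine features; the Euclidean distance between mapped points then approximates the kernel distance $\|\phi(x)-\phi(y)\| = \sqrt{2 - 2k(x,y)}$. The function and parameter names below are illustrative assumptions.

```python
import numpy as np

def gaussian_rff(X, m, sigma=1.0, seed=None):
    """Map rows of X (points in R^d) to R^{2m} so that the inner product of
    mapped points approximates the Gaussian kernel exp(-||x-y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the Gaussian kernel,
    # which is itself Gaussian with scale 1/sigma.
    W = rng.normal(scale=1.0 / sigma, size=(d, m))
    proj = X @ W
    # Paired cos/sin features: each mapped point has unit norm, and the
    # inner product is an unbiased estimate of the kernel value.
    return np.hstack([np.cos(proj), np.sin(proj)]) / np.sqrt(m)

# Compare the mapped Euclidean distance to the exact kernel distance.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(2, 5))
k = np.exp(-np.sum((x - y) ** 2) / 2.0)      # Gaussian kernel, sigma = 1
exact = np.sqrt(2 - 2 * k)                    # kernel distance in the RKHS
Z = gaussian_rff(np.vstack([x, y]), m=2000, seed=1)
approx = np.linalg.norm(Z[0] - Z[1])          # distance in the feature space
```

With $m = 2000$ feature pairs, `approx` typically agrees with `exact` to within a few percent; the paper's contribution is that, for the Gaussian kernel, this error is *relative* (not just additive), with the dimension bounds stated above.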