[1602.05350] Relative Error Embeddings for the Gaussian Kernel Distance

arXiv - Machine Learning


Computer Science > Machine Learning — arXiv:1602.05350 (cs)

[Submitted on 17 Feb 2016 (v1), last revised 22 Mar 2026 (this version, v3)]

Title: Relative Error Embeddings for the Gaussian Kernel Distance

Authors: Di Chen, Jeff M. Phillips

Abstract: A reproducing kernel can define an embedding of a data point into an infinite-dimensional reproducing kernel Hilbert space (RKHS). The norm in this space describes a distance, which we call the kernel distance. The random Fourier features (of Rahimi and Recht) describe an oblivious approximate mapping into a finite-dimensional Euclidean space that behaves similarly to the RKHS. We show in this paper that for the Gaussian kernel, the Euclidean norm between these mapped features has $(1+\varepsilon)$-relative error with respect to the kernel distance. When there are $n$ data points, we show that $O((1/\varepsilon^2) \log(n))$ dimensions of the approximate feature space are both sufficient and necessary. Without a bound on $n$, but when the original points lie in $\mathbb{R}^d$ and have diameter bounded by $\mathcal{M}$, we show that $O((d/\varepsilon^2) \log(\mathcal{M}))$ dimensions are sufficient, and that this many are required, up to $\log(1/\varepsilon)$ factors.

Subjects: Machine Learning (cs.LG)

Cite as: arXiv:1602.05350 [cs.LG] (or arXiv:1602.05350v3 [cs.LG] for this ve...
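As a rough illustration of the construction the abstract refers to, the sketch below (a minimal NumPy version, not the authors' code; function names and the bandwidth parameter `sigma` are illustrative) maps points to random Fourier features for the Gaussian kernel and compares the Euclidean distance between the mapped features against the exact kernel distance $\sqrt{k(x,x) + k(y,y) - 2k(x,y)}$:

```python
import numpy as np

def rff_features(X, D, sigma=1.0, seed=None):
    """Map rows of X (in R^d) to D random Fourier features for the
    Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)),
    following the Rahimi-Recht construction."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the kernel's Fourier transform (a Gaussian),
    # plus uniform random phase shifts.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def kernel_distance(x, y, sigma=1.0):
    """Exact kernel (RKHS) distance for the Gaussian kernel:
    since k(x, x) = k(y, y) = 1, this is sqrt(2 - 2 k(x, y))."""
    k = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
    return np.sqrt(2.0 - 2.0 * k)

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 5))          # two points in R^5
Z = rff_features(X, D=20000, seed=1) # a large D keeps the relative error small
approx = np.linalg.norm(Z[0] - Z[1]) # Euclidean distance in feature space
exact = kernel_distance(X[0], X[1])  # kernel distance in the RKHS
```

With the feature dimension `D` large, `approx / exact` concentrates near 1, which is the $(1+\varepsilon)$-relative-error behavior the paper analyzes; the paper's contribution is pinning down how small `D` can be made as a function of $\varepsilon$, $n$, $d$, and $\mathcal{M}$.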

Originally published on March 24, 2026. Curated by AI News.

