[2502.10600] Weighted quantization using MMD: From mean field to mean shift via gradient flows

arXiv - Machine Learning

Statistics > Machine Learning
arXiv:2502.10600 (stat)
[Submitted on 14 Feb 2025 (v1), last revised 1 Apr 2026 (this version, v3)]

Title: Weighted quantization using MMD: From mean field to mean shift via gradient flows
Authors: Ayoub Belhadji, Daniel Sharp, Youssef Marzouk

Abstract: Approximating a probability distribution using a set of particles is a fundamental problem in machine learning and statistics, with applications including clustering and quantization. Formally, we seek a weighted mixture of Dirac measures that best approximates the target distribution. While much existing work relies on the Wasserstein distance to quantify approximation errors, maximum mean discrepancy (MMD) has received comparatively less attention, especially when allowing for variable particle weights. We argue that a Wasserstein-Fisher-Rao gradient flow is well-suited for designing quantizations optimal under MMD. We show that a system of interacting particles satisfying a set of ODEs discretizes this flow. We further derive a new fixed-point algorithm called mean shift interacting particles (MSIP). We show that MSIP extends the classical mean shift algorithm, widely used for identifying modes in kernel density estimators. Moreover, we show that MSIP can be interpreted as preconditioned gradient descent and that it acts as a relaxation...
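To make the objective concrete, here is a minimal sketch (not the authors' code) of the squared MMD between a weighted Dirac mixture $\sum_i w_i \delta_{x_i}$ and an empirical target measure. The Gaussian kernel, function names, and uniform target weights are assumptions chosen for illustration; the paper's kernel and optimization scheme may differ.

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    # Pairwise k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 * bandwidth^2))
    sq = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

def mmd_squared(x, w, y):
    """Squared MMD between the weighted mixture sum_i w_i * delta_{x_i}
    and the uniform empirical measure on target samples y.
    Weights w must sum to 1."""
    v = np.full(len(y), 1.0 / len(y))  # uniform target weights
    kxx = gaussian_kernel(x, x)
    kyy = gaussian_kernel(y, y)
    kxy = gaussian_kernel(x, y)
    # ||mu_x - mu_y||^2 in the kernel's RKHS, expanded as quadratic forms
    return w @ kxx @ w + v @ kyy @ v - 2.0 * (w @ kxy @ v)

rng = np.random.default_rng(0)
y = rng.normal(size=(200, 2))   # target samples
x = rng.normal(size=(10, 2))    # particles (the quantization)
w = np.full(10, 0.1)            # uniform particle weights
print(mmd_squared(x, w, y))     # nonnegative, since the kernel is PSD
```

A quantization algorithm in this setting would adjust both the particle locations `x` and the weights `w` to drive this quantity down; the paper's Wasserstein-Fisher-Rao flow moves mass in both of those directions simultaneously.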

Originally published on April 03, 2026. Curated by AI News.

