[2506.07578] Denoising the Future: Top-p Distributions for Moving Through Time

arXiv - AI

Computer Science > Machine Learning

arXiv:2506.07578 (cs) [Submitted on 9 Jun 2025 (v1), last revised 31 Mar 2026 (this version, v4)]

Title: Denoising the Future: Top-p Distributions for Moving Through Time
Authors: Florian Andreas Marwitz, Ralf Möller, Magnus Bender, Marcel Gehrke

Abstract: Inference in dynamic probabilistic models is a complex task involving expensive operations. In particular, for Hidden Markov Models, the whole state space has to be enumerated to advance in time: even states with negligible probability are considered, which is computationally inefficient and can increase noise by propagating unlikely probability mass. We propose to denoise the future and speed up inference by using only the top-p transitions, i.e., the most probable transitions with accumulated probability p. We show that the error introduced by using only the top-p transitions is bounded by p and the so-called minimal mixing rate of the underlying model. The same bound holds when keeping only the top-p states instead of the top-p transitions. Moreover, our empirical evaluation shows that using top-p transitions yields speedups of at least an order of magnitude, while the error in terms of total variation distance stays below 0.09. Using the top-p states is slower ...
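A minimal sketch of the idea behind top-p truncation in an HMM time update. This is not the authors' implementation; the function names, the per-row truncation, and the renormalization choice are assumptions for illustration:

```python
def top_p(probs, p):
    """Keep the most probable entries until their cumulative mass reaches p,
    zero out the rest, and renormalize so the result is a distribution."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = set(), 0.0
    for i in order:
        kept.add(i)
        mass += probs[i]
        if mass >= p:
            break
    truncated = [probs[i] if i in kept else 0.0 for i in range(len(probs))]
    z = sum(truncated)
    return [q / z for q in truncated]

def forward_step(belief, transition, p):
    """One HMM time update (prediction) that propagates probability mass
    only along the top-p transitions out of each state.
    `transition[s]` is the next-state distribution from state s."""
    n = len(belief)
    new_belief = [0.0] * n
    for s in range(n):
        for t, q in enumerate(top_p(transition[s], p)):
            new_belief[t] += belief[s] * q
    return new_belief
```

For example, with p = 0.8 a row like [0.9, 0.1] collapses to [1.0, 0.0]: the unlikely transition is dropped entirely, so its mass is never propagated, which is the "denoising" effect the abstract describes.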

Originally published on April 01, 2026. Curated by AI News.


