[2604.02121] Gradient estimators for parameter inference in discrete stochastic kinetic models
arXiv:2604.02121 [physics.comp-ph]
Physics > Computational Physics
Submitted on 2 Apr 2026
Title: Gradient estimators for parameter inference in discrete stochastic kinetic models
Authors: Ludwig Burger, Annalena Kofler, Lukas Heinrich, Ulrich Gerland

Abstract: Stochastic kinetic models are ubiquitous in physics, yet inferring their parameters from experimental data remains challenging. In deterministic models, parameter inference often relies on gradients, since they can be obtained efficiently through automatic differentiation. However, these tools cannot be applied directly to stochastic simulation algorithms (SSA) such as the Gillespie algorithm, because sampling from a discrete set of reactions introduces non-differentiable operations. In this work, we adapt three gradient estimators from machine learning to the Gillespie SSA: the Gumbel-Softmax Straight-Through (GS-ST) estimator, the Score Function estimator, and the Alternative Path estimator. We compare the properties of all estimators in two representative systems exhibiting relaxation or oscillatory dynamics, where the latter requires gradient estimation of time-dependent objective functions. We find that the GS-ST estimator mostly yields well-behaved gradient estimates, but exhibits diverging variance in challe...
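To illustrate why discrete reaction sampling blocks automatic differentiation, and how one of the estimators named above circumvents it, here is a minimal sketch of a Score Function (REINFORCE) gradient estimate for a single Gillespie reaction choice. The two-reaction system, the `propensities` function, and the objective `f(k) = k` are hypothetical stand-ins, not the paper's benchmark systems; the identity used is d/dθ E[f] = E[f(k) ∂θ log p(k; θ)], with p_i = a_i(θ) / Σ_j a_j(θ).

```python
import numpy as np

rng = np.random.default_rng(0)

def propensities(theta):
    # Hypothetical two-reaction system whose propensities are
    # simply the rate parameters themselves: a_i = theta_i.
    return np.asarray(theta, dtype=float)

def score_function_grad(theta, objective, n_samples=200_000):
    """Monte Carlo score-function estimate of d/dtheta E[objective(k)],
    where the reaction index k is drawn with probability a_k / sum(a),
    as in one step of the Gillespie SSA."""
    a = propensities(theta)
    a0 = a.sum()
    p = a / a0
    # Sample discrete reaction choices (the non-differentiable step).
    ks = rng.choice(len(p), size=n_samples, p=p)
    f = objective(ks)
    grads = np.empty_like(a)
    for j in range(len(a)):
        # For a_i = theta_i:  d log p_k / d theta_j = delta_{kj}/theta_j - 1/a0
        dlogp = (ks == j) / a[j] - 1.0 / a0
        grads[j] = np.mean(f * dlogp)
    return grads

theta = np.array([1.0, 1.0])
grad = score_function_grad(theta, objective=lambda k: k.astype(float))
# Analytically, E[f] = theta_1/(theta_0+theta_1), so the exact gradient
# at theta = (1, 1) is (-1/4, +1/4); the estimate should be close.
print(grad)
```

The high variance of exactly this kind of estimator over long trajectories is one motivation for the comparison with the GS-ST and Alternative Path estimators in the abstract.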