[2509.24762] In-Context Learning of Temporal Point Processes with Foundation Inference Models
arXiv:2509.24762 (cs) — Computer Science > Machine Learning
Submitted on 29 Sep 2025 (v1); last revised 27 Feb 2026 (this version, v2)

Title: In-Context Learning of Temporal Point Processes with Foundation Inference Models
Authors: David Berghaus, Patrick Seifner, Kostadin Cvejoski, César Ojeda, Ramsés J. Sánchez

Abstract: Modeling event sequences of multiple event types with marked temporal point processes (MTPPs) provides a principled way to uncover governing dynamical rules and predict future events. Current neural network approaches to MTPP inference rely on training separate, specialized models for each target system. We pursue a radically different approach: drawing on amortized inference and in-context learning, we pretrain a deep neural network to infer, in-context, the conditional intensity functions of event histories from a context defined by sets of event sequences. Pretraining is performed on a large synthetic dataset of MTPPs sampled from a broad distribution of Hawkes processes. Once pretrained, our Foundation Inference Model for Point Processes (FIM-PP) can estimate MTPPs from real-world data without any additional training, or be rapidly finetuned to target systems. Experiments show that this amortized approach matches the performance of specialized models on next-event prediction a...
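The abstract's pretraining corpus is built from Hawkes processes, whose conditional intensity jumps at each event and then decays. As an illustration only (not the paper's pipeline), a univariate Hawkes process with an exponential kernel, lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)), can be simulated with Ogata's thinning algorithm; all parameter names below are illustrative choices, not values from the paper:

```python
import math
import random

def sample_hawkes(mu, alpha, beta, horizon, seed=0):
    """Sample event times of a univariate Hawkes process on [0, horizon)
    with exponential kernel, via Ogata's thinning algorithm.
    Stability requires alpha < beta (branching ratio alpha/beta < 1)."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while t < horizon:
        # Intensity just after time t; it upper-bounds the intensity
        # until the next event because the exponential kernel only decays.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)  # candidate from a homogeneous process
        if t >= horizon:
            break
        # True intensity at the candidate time (past excitations have decayed).
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)  # accept; intensity jumps by alpha at t
    return events

times = sample_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=50.0)
```

Sampling many such sequences across a broad prior over (mu, alpha, beta) is, at a sketch level, how a synthetic pretraining distribution of point processes can be assembled.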