[2601.09220] From Hawkes Processes to Attention: Time-Modulated Mechanisms for Event Sequences
Computer Science > Machine Learning

arXiv:2601.09220 (cs)

[Submitted on 14 Jan 2026 (v1), last revised 24 Mar 2026 (this version, v2)]

Title: From Hawkes Processes to Attention: Time-Modulated Mechanisms for Event Sequences

Authors: Xinzi Tan, Kejian Zhang, Junhan Yu, Doudou Zhou

Abstract: Marked Temporal Point Processes (MTPPs) arise naturally in medical, social, commercial, and financial domains. However, existing Transformer-based methods mostly inject temporal information only via positional encodings, relying on shared or parametric decay structures, which limits their ability to capture heterogeneous, type-specific temporal effects. Motivated by this observation, we derive a novel attention operator for MTPPs, called Hawkes Attention, from multivariate Hawkes process theory, using learnable per-type neural kernels to modulate the query, key, and value projections, thereby replacing the corresponding parts of standard attention. Benefiting from this design, Hawkes Attention unifies event timing and content interaction, learning both time-dependent behavior and type-specific excitation patterns from the data. Experimental results show that our method outperforms the baselines. Beyond general MTPPs, our attention mechanism can also be easily applied to...
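Since only the abstract is available, the following is a minimal, hypothetical sketch of what a time-modulated attention step for event sequences might look like. It assumes a simple exponential decay kernel exp(-beta_m * dt) indexed by the source event's type, and it applies the kernel to the attention weights rather than to the full query/key/value projections the paper describes; the function name `hawkes_attention`, the kernel form, and all parameter names are illustrative choices, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax; -inf entries become exact zeros.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def hawkes_attention(x, times, types, Wq, Wk, Wv, beta):
    """Sketch of time-modulated self-attention for a marked event sequence.

    x     : (L, d) event embeddings
    times : (L,)   event timestamps (assumed sorted ascending)
    types : (L,)   integer event-type labels in [0, M)
    beta  : (M,)   learnable per-type decay rates (hypothetical kernel form)
    """
    L, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # Time gaps dt[i, j] = t_i - t_j between attending event i and source j.
    dt = times[:, None] - times[None, :]

    # Per-type exponential decay kernel, indexed by the *source* event's type.
    kernel = np.exp(-beta[types][None, :] * np.maximum(dt, 0.0))

    # Standard scaled dot-product scores with a causal mask (no future events).
    scores = (q @ k.T) / np.sqrt(d)
    scores = np.where(dt < 0, -np.inf, scores)

    # Modulate the content-based weights by the time kernel, then renormalize.
    attn = softmax(scores, axis=-1) * kernel
    attn = attn / np.maximum(attn.sum(axis=-1, keepdims=True), 1e-12)
    return attn @ v
```

Under this toy construction, the first event can only attend to itself, so its output equals its own value projection; later events mix values with weights that decay in both content mismatch and elapsed time, at a rate set by each source event's type.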