[2602.23006] Regular Fourier Features for Nonstationary Gaussian Processes
Summary
The paper introduces regular Fourier features for simulating nonstationary Gaussian processes, addressing a limitation of existing spectral methods, which rely on interpreting the spectral density as a probability distribution.
Why It Matters
This research is significant because it makes simulating Gaussian processes more efficient in nonstationary settings, where exact sampling scales cubically with the number of sample locations. By also supporting kernel learning when the spectral density is unknown, it opens new avenues for modeling complex, nonstationary processes in real-world machine learning and statistics applications.
Key Takeaways
- Introduces regular Fourier features for nonstationary Gaussian processes.
- Addresses limitations of spectral methods in nonstationary contexts.
- Provides an efficient low-rank approximation that maintains positive semi-definiteness.
- Extends to kernel learning when the spectral density is unknown.
- Demonstrates effectiveness on locally stationary and harmonizable mixture kernels.
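To make the "regular Fourier features" idea concrete, here is a minimal sketch for the stationary case: instead of Monte Carlo sampling frequencies from the spectral density, the density is discretized on a regular frequency grid, and each node contributes a deterministic cosine/sine feature pair weighted by its quadrature mass. The Gram matrix is then a feature inner product and hence positive semi-definite by construction. The RBF kernel, grid range, and tolerances below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def regular_fourier_features(x, omegas, weights):
    """Deterministic Fourier feature map: one cos/sin pair per grid node."""
    sw = np.sqrt(weights)
    return np.concatenate([sw * np.cos(np.outer(x, omegas)),
                           sw * np.sin(np.outer(x, omegas))], axis=1)

# Spectral density of the RBF kernel k(r) = exp(-r^2 / (2 l^2)), here with l = 1,
# under the convention k(r) = integral of S(w) * exp(i w r) dw (so S integrates to 1).
ell = 1.0
omegas = np.linspace(-6.0, 6.0, 121)          # regular frequency grid (assumed range)
d_omega = omegas[1] - omegas[0]
S = ell / np.sqrt(2 * np.pi) * np.exp(-0.5 * (ell * omegas) ** 2)
weights = S * d_omega                          # quadrature mass per grid node

x = np.linspace(-3.0, 3.0, 50)
Phi = regular_fourier_features(x, omegas, weights)
K_approx = Phi @ Phi.T                         # PSD by construction: a Gram matrix
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
err = np.max(np.abs(K_approx - K_exact))       # quadrature error of the grid sum
```

Because `K_approx = Phi @ Phi.T` has rank at most twice the number of grid nodes, a GP draw costs a matrix-vector product with `Phi` rather than a cubic Cholesky factorization of the full kernel matrix.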
Statistics > Machine Learning
arXiv:2602.23006 (stat)
[Submitted on 26 Feb 2026]
Title: Regular Fourier Features for Nonstationary Gaussian Processes
Authors: Arsalan Jawaid, Abdullah Karatas, Jörg Seewig
Abstract: Simulating a Gaussian process requires sampling from a high-dimensional Gaussian distribution, which scales cubically with the number of sample locations. Spectral methods address this challenge by exploiting the Fourier representation, treating the spectral density as a probability distribution for Monte Carlo approximation. Although this probabilistic interpretation works for stationary processes, it is overly restrictive for the nonstationary case, where spectral densities are generally not probability measures. We propose regular Fourier features for harmonizable processes that avoid this limitation. Our method discretizes the spectral representation directly, preserving the correlation structure among spectral weights without requiring probability assumptions. Under a finite spectral support assumption, this yields an efficient low-rank approximation that is positive semi-definite by construction. When the spectral density is unknown, the framework extends naturally to kernel learning from data. We demonstrate the method on locally stationary kernels and on harmonizable mixture kernels with complex-val...
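The abstract's move from stationary to harmonizable processes can be illustrated with a toy discretized harmonizable kernel, k(x, x') = sum over j, k of B[j, k] * exp(i (w_j x - w_k x')), where B is a Hermitian PSD matrix of correlated spectral weights. The rank-one choice B = s s^H below is a deliberately simple assumption for illustration (it is not the paper's construction); it yields a valid nonstationary kernel, and a process draw costs only a complex scalar times a precomputed vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Regular frequency grid and a complex spectral weight vector s (arbitrary toy choice).
omegas = np.linspace(-3.0, 3.0, 31)
s = np.exp(-0.5 * omegas ** 2) * (1 + 0.3j * omegas)

x = np.linspace(-2.0, 2.0, 40)
E = np.exp(1j * np.outer(x, omegas))      # Fourier design matrix, E[n, j] = exp(i w_j x_n)
g = E @ s                                  # g(x_n) = sum_j s_j exp(i w_j x_n)

# Rank-one harmonizable spectral matrix B = s s^H gives K = Re(g g^H):
# Hermitian PSD before taking the real part, and the real part of a PSD
# Hermitian matrix is itself symmetric PSD, so K is a valid kernel matrix.
K = np.real(np.outer(g, np.conj(g)))
min_eig = np.linalg.eigvalsh((K + K.T) / 2).min()

# One process draw: with c = u + i v, u, v ~ N(0, 1), the covariance of
# f(x) = Re(c * g(x)) is exactly Re(g(x) * conj(g(x'))) = K.
c = rng.standard_normal() + 1j * rng.standard_normal()
f = np.real(c * g)
```

In general B would be a full correlation matrix among spectral weights; the rank-one case just shows why positive semi-definiteness comes for free once the discretized spectral matrix is PSD.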