[2602.23409] Long Range Frequency Tuning for QML
Computer Science > Machine Learning
arXiv:2602.23409 (cs)
[Submitted on 26 Feb 2026]

Title: Long Range Frequency Tuning for QML
Authors: Michael Poppel, Jonas Stein, Sebastian Wölckert, Markus Baumann, Claudia Linnhoff-Popien

Abstract: Quantum machine learning models using angle encoding naturally represent truncated Fourier series, providing universal function approximation with sufficient circuit depth. For unary fixed-frequency encodings, circuit depth scales as O(omega_max * (omega_max + epsilon^{-2})), where omega_max is the target frequency magnitude and epsilon the precision. Trainable-frequency approaches theoretically reduce this to the size of the target spectrum, requiring only as many encoding gates as there are frequencies in that spectrum. Despite this compelling efficiency, their practical effectiveness hinges on a key assumption: that gradient-based optimization can drive frequency prefactors to arbitrary target values. We demonstrate through systematic experiments that frequency prefactors exhibit limited trainability: movement is constrained to approximately +/-1 units with typical learning rates. When target frequencies lie outside this reachable range, optimization frequently fails. To overcome this frequency reachability limitation, we propose grid-based initialization using ternary encodings, which generate dense integer frequency spect...
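The "ternary encoding" idea in the (truncated) abstract can be illustrated with a small sketch. Assuming it refers to the standard balanced-ternary construction, where encoding-gate prefactors are the powers 3^0, 3^1, ..., 3^{n-1} and each contributes with sign -1, 0, or +1, the resulting sums cover every integer frequency in [-(3^n - 1)/2, (3^n - 1)/2], i.e. a dense integer grid. The function name and framing below are illustrative, not taken from the paper:

```python
from itertools import product

def ternary_spectrum(n: int) -> list[int]:
    """Integer frequencies reachable as sum(s_k * 3**k) with s_k in {-1, 0, +1}.

    With n encoding gates whose prefactors are the powers of three, the
    3**n sign combinations produce every integer in the symmetric range
    [-(3**n - 1) // 2, (3**n - 1) // 2] exactly once (balanced ternary).
    """
    prefactors = [3**k for k in range(n)]
    freqs = {
        sum(s * p for s, p in zip(signs, prefactors))
        for signs in product((-1, 0, 1), repeat=n)
    }
    return sorted(freqs)

# Three gates (prefactors 1, 3, 9) already yield the dense grid -13..13.
spectrum = ternary_spectrum(3)
assert spectrum == list(range(-13, 14))
```

Under this reading, initializing prefactors on such a grid would place some frequency within +/-1 of any integer target, which is consistent with the abstract's finding that gradient descent can only move prefactors by roughly one unit.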