[2603.22155] RAMPAGE: RAndomized Mid-Point for debiAsed Gradient Extrapolation
Computer Science > Machine Learning
arXiv:2603.22155 (cs)
[Submitted on 23 Mar 2026]

Title: RAMPAGE: RAndomized Mid-Point for debiAsed Gradient Extrapolation
Authors: Abolfazl Hashemi

Abstract: A celebrated method for Variational Inequalities (VIs) is Extragradient (EG), which can be viewed as a standard discrete-time integration scheme. With this view in mind, in this paper we show that EG may suffer from discretization bias when applied to non-linear vector fields, conservative or otherwise. To resolve this discretization shortcoming, we introduce RAndomized Mid-Point for debiAsed Gradient Extrapolation (RAMPAGE) and its variance-reduced counterpart, RAMPAGE+, which leverages antithetic sampling. In contrast with EG, both methods are unbiased. Furthermore, leveraging negative correlation, RAMPAGE+ acts as an unbiased, geometric path-integrator that completely removes internal first-order terms from the variance, provably improving upon RAMPAGE. We further demonstrate that both methods enjoy provable $\mathcal{O}(1/k)$ convergence guarantees for a range of problems, including root finding under co-coercive, co-hypomonotone, and generalized Lipschitzness regimes. In addition, we introduce symmetrically scaled variants to extend our results to constrained VIs. Finally, we provide convergence guarantees of both methods for stochastic a...
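
The abstract describes RAMPAGE as a randomized mid-point correction of the EG discretization and RAMPAGE+ as its antithetic-sampling, variance-reduced variant. The exact update rules are not given on this page; the sketch below is a minimal, hypothetical illustration of those two ideas, assuming the random mid-point is drawn uniformly along the extrapolation step and that RAMPAGE+ averages the antithetic pair (tau, 1 - tau). The function names and the toy vector field are illustrative, not the authors' implementation.

```python
import numpy as np

def eg_step(F, x, gamma):
    # Standard Extragradient: fixed mid-point at the fully extrapolated point.
    x_mid = x - gamma * F(x)
    return x - gamma * F(x_mid)

def rampage_step(F, x, gamma, rng):
    # Hypothetical randomized mid-point step: evaluate F at a uniformly random
    # fraction tau of the extrapolation, debiasing the discretization in
    # expectation over tau.
    tau = rng.uniform()
    x_tau = x - tau * gamma * F(x)
    return x - gamma * F(x_tau)

def rampage_plus_step(F, x, gamma, rng):
    # Hypothetical antithetic variant: pair tau with 1 - tau so the two
    # mid-point gradients are negatively correlated, reducing variance while
    # keeping the estimator unbiased.
    tau = rng.uniform()
    g = F(x)
    g1 = F(x - tau * gamma * g)
    g2 = F(x - (1.0 - tau) * gamma * g)
    return x - 0.5 * gamma * (g1 + g2)

if __name__ == "__main__":
    # Toy strongly monotone but non-conservative (non-symmetric) vector field,
    # used only to exercise the steps; its unique root is the origin.
    A = np.array([[1.0, 2.0], [-2.0, 1.0]])
    F = lambda x: A @ x
    rng = np.random.default_rng(0)
    x = np.array([1.0, 0.0])
    for _ in range(100):
        x = rampage_plus_step(F, x, gamma=0.1, rng=rng)
    print(x)  # iterates contract toward the root at the origin
```

Note that for a linear field the tau-dependence cancels in the antithetic average, so this sketch of RAMPAGE+ reduces to a deterministic step; the randomization only matters for the non-linear vector fields the abstract targets.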