On the analysis of variance-reduced and randomized projection variants of single projection schemes for monotone stochastic variational inequality problems
From MaRDI portal
Publication:2045192
Abstract: Classical extragradient schemes and their stochastic counterparts represent a cornerstone for resolving monotone variational inequality problems. Yet, such schemes have a per-iteration complexity of two projections onto a convex set and require two evaluations of the map, the former of which could be relatively expensive if the feasible set is complicated. We consider two related avenues where the per-iteration complexity is significantly reduced: (i) a stochastic projected reflected gradient method requiring a single evaluation of the map and a single projection; and (ii) a stochastic subgradient extragradient method that requires two evaluations of the map, a single projection onto the feasible set, and a significantly cheaper projection (onto a halfspace) computable in closed form. Under a variance-reduced framework reliant on a sample average of the map based on an increasing batch size, we prove almost sure (a.s.) convergence of the iterates to a random point in the solution set for both schemes. Additionally, both schemes display a non-asymptotic rate guarantee in terms of the number of iterations; notably, both rates match those obtained in deterministic regimes. To address feasibility sets given by the intersection of a large number of convex constraints, we adapt both of the aforementioned schemes to a random projection framework. We then show that the random projection analogs of both schemes also display a.s. convergence under a weak-sharpness requirement; furthermore, without imposing the weak-sharpness requirement, both schemes are characterized by a provable rate in terms of the gap function of the projection of the averaged sequence onto the feasible set, as well as the infeasibility of this sequence. Preliminary numerics support the theoretical findings, and the schemes outperform standard extragradient schemes in terms of per-iteration complexity.
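The two single-projection schemes described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the problem instance (a strongly monotone affine map `F(x) = A x + b` with additive Gaussian noise, a box feasible set `X = [0, 1]^n`, the step size `gamma`, and the batch schedule `k²`) is an assumed toy setup chosen so that both projections are cheap and convergence is easy to check via the natural residual.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Illustrative problem (assumed, not from the paper): F(x) = A x + b with A
# positive definite, hence a monotone map; X = [0, 1]^n has a cheap projection.
# The stochastic oracle returns F(x) plus zero-mean Gaussian noise.
M = rng.standard_normal((n, n))
A = M.T @ M / n + np.eye(n)
b = rng.standard_normal(n)
L = np.linalg.norm(A, 2)            # Lipschitz constant of F
gamma = 0.4 / L                     # step size small enough for both schemes

def F_sample(x, batch):
    # Averaging `batch` i.i.d. N(0, I) noise draws is distributionally the
    # same as a single N(0, I / batch) draw, so simulate it directly.
    return A @ x + b + rng.standard_normal(n) / np.sqrt(batch)

def proj_X(x):                      # projection onto X = [0, 1]^n
    return np.clip(x, 0.0, 1.0)

def proj_halfspace(z, a, beta):     # closed-form projection onto {w : <a, w> <= beta}
    viol = a @ z - beta
    return z if viol <= 0 else z - (viol / (a @ a)) * a

def residual(x):                    # natural residual: zero exactly at a solution
    return np.linalg.norm(x - proj_X(x - (A @ x + b)))

K = 600

# (i) Variance-reduced stochastic projected reflected gradient:
# one map evaluation and one projection onto X per iteration.
x_prev = x = np.full(n, 0.5)
for k in range(1, K + 1):
    y = 2 * x - x_prev                            # reflected point
    x_prev, x = x, proj_X(x - gamma * F_sample(y, k * k))
res_prg = residual(x)

# (ii) Variance-reduced stochastic subgradient extragradient:
# one projection onto X plus one closed-form halfspace projection.
x = np.full(n, 0.5)
for k in range(1, K + 1):
    g = F_sample(x, k * k)
    y = proj_X(x - gamma * g)
    a = (x - gamma * g) - y                       # X lies in {w : <a, w - y> <= 0}
    x = proj_halfspace(x - gamma * F_sample(y, k * k), a, a @ y)
res_seg = residual(x)

print(f"residuals: PRG {res_prg:.4f}, SEG {res_seg:.4f}")
```

Both loops drive the natural residual toward zero; the increasing batch size `k²` is what makes the per-iteration noise summable, mirroring the variance-reduced framework of the abstract.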
Recommendations
- Extragradient Method with Variance Reduction for Stochastic Variational Inequalities
- Optimal stochastic extragradient schemes for pseudomonotone stochastic variational inequality problems and their variants
- Incremental Constraint Projection Methods for Monotone Stochastic Variational Inequalities
- A fast stochastic approximation-based subgradient extragradient algorithm with variance reduction for solving stochastic variational inequality problems
- Variance-based single-call proximal extragradient algorithms for stochastic mixed variational inequalities
Cites work
- scientific article; zbMATH DE number 4164577
- scientific article; zbMATH DE number 3534286
- A Stochastic Approximation Method
- Accelerated schemes for a class of variational inequalities
- Addressing supply-side risk in uncertain power markets: stochastic Nash models, scalable algorithms and error analysis
- Convergence rate of \(\mathcal{O}(1/k)\) for optimistic gradient and extragradient methods in smooth convex-concave saddle point problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Corrections to “A High-Order Internal Model Based Iterative Learning Control Scheme for Nonlinear Systems With Time-Iteration-Varying Parameters”
- Distributed computation of equilibria in monotone Nash games via iterative regularization techniques
- Equilibrium models and variational inequalities.
- Extragradient Method with Variance Reduction for Stochastic Variational Inequalities
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Generalized Nash equilibrium problems
- Incremental Constraint Projection Methods for Monotone Stochastic Variational Inequalities
- Incremental constraint projection methods for variational inequalities
- Incremental proximal methods for large scale convex optimization
- Introduction to stochastic programming.
- Lectures on stochastic programming. Modeling and theory.
- On smoothing, regularization, and averaging in stochastic approximation methods for stochastic variational inequality problems
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
- On the convergence properties of non-Euclidean extragradient methods for variational inequalities with generalized monotone operators
- Optimal stochastic extragradient schemes for pseudomonotone stochastic variational inequality problems and their variants
- Projected reflected gradient methods for monotone variational inequalities
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Random algorithms for convex minimization problems
- Regularized Iterative Stochastic Approximation Methods for Stochastic Variational Inequality Problems
- Robust Stochastic Approximation Approach to Stochastic Programming
- Sample average approximation methods for a class of stochastic variational inequality problems
- Self-Tuned Stochastic Approximation Schemes for Non-Lipschitzian Stochastic Multi-User Optimization and Nash Games
- Solving variational inequalities with stochastic mirror-prox algorithm
- Stochastic Approximation Approaches to the Stochastic Variational Inequality Problem
- Stochastic first-order methods with random constraint projection
- Stochastic variational inequalities: residual minimization smoothing sample average approximations
- Temporal Difference Methods for General Projected Equations
- The subgradient extragradient method for solving variational inequalities in Hilbert space
- Variational Analysis
Cited in (13)
- Two fast variance-reduced proximal gradient algorithms for SMVIPs -- stochastic mixed variational inequality problems with suitable applications to stochastic network games and traffic assignment problems
- Dynamic stochastic projection method for multistage stochastic variational inequalities
- Incremental Constraint Projection Methods for Monotone Stochastic Variational Inequalities
- Variance-based single-call proximal extragradient algorithms for stochastic mixed variational inequalities
- Variable sample-size optimistic mirror descent algorithm for stochastic mixed variational inequalities
- A fast stochastic approximation-based subgradient extragradient algorithm with variance reduction for solving stochastic variational inequality problems
- Stochastic relaxed inertial forward-backward-forward splitting for monotone inclusions in Hilbert spaces
- Self adaptive inertial subgradient extragradient algorithms for solving pseudomonotone variational inequality problems
- Stochastic approximation for estimating the price of stability in stochastic Nash games
- Variable sample-size operator extrapolation algorithm for stochastic mixed variational inequalities
- Simple and optimal methods for stochastic variational inequalities. I: Operator extrapolation
- Complexity guarantees for an implicit smoothing-enabled method for stochastic MPECs
- Variance-based stochastic projection gradient method for two-stage co-coercive stochastic variational inequalities