A version of the mirror descent method to solve variational inequalities
DOI: 10.1007/s10559-017-9923-9 · zbMath: 1384.49014 · OpenAlex: W2603831552 · MaRDI QID: Q681901
Publication date: 13 February 2018
Published in: Cybernetics and Systems Analysis
Full work available at URL: https://doi.org/10.1007/s10559-017-9923-9
Keywords: convergence, variational inequality, pseudomonotonicity, Kullback-Leibler distance, Bregman distance, mirror descent method
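The keywords refer to mirror descent with a Bregman distance (here the Kullback-Leibler divergence) applied to variational inequalities. As a generic illustration only, and not a reproduction of the paper's specific algorithm, the sketch below shows the standard entropic mirror-descent step on the probability simplex, where the KL-proximal update has a closed multiplicative form; the function name kl_mirror_step, the step size lam, and the skew-symmetric test operator A are all illustrative choices.

```python
import numpy as np

def kl_mirror_step(x, F_x, lam):
    """One entropic mirror-descent step on the probability simplex.

    With the Kullback-Leibler divergence as the Bregman distance, the
    proximal step  argmin_y { lam*<F_x, y> + KL(y, x) }  has the closed
    form  y_i proportional to  x_i * exp(-lam * F_x_i).
    """
    y = x * np.exp(-lam * F_x)
    return y / y.sum()

# Toy usage: F(x) = A x with a skew-symmetric A is a standard monotone
# test operator for variational inequalities on the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A - A.T                      # skew-symmetric => monotone operator
x = np.full(5, 1.0 / 5)          # start at the simplex barycenter
for _ in range(200):
    x = kl_mirror_step(x, A @ x, lam=0.1)
print(x)
```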
Related Items (13)
- Modified extragradient method with Bregman distance for variational inequalities
- Two Bregman projection methods for solving variational inequalities
- An explicit extragradient algorithm for solving variational inequalities
- Adaptive two-stage Bregman method for variational inequalities
- Modified extragradient-like algorithms with new stepsizes for variational inequalities
- Adaptive extraproximal algorithm for the equilibrium problem in Hadamard spaces
- Stochastic incremental mirror descent algorithms with Nesterov smoothing
- Convergence of a two-stage proximal algorithm for the equilibrium problem in Hadamard spaces
- An adaptive two-stage proximal algorithm for equilibrium problems in Hadamard spaces
- Inertial hybrid splitting methods for operator inclusion problems
- An adaptive algorithm for the variational inequality over the set of solutions of the equilibrium problem
- Convergence of two-stage method with Bregman divergence for solving variational inequalities
- Bregman extragradient method with monotone rule of step adjustment
Cites Work
- Low-cost modification of Korpelevich's methods for monotone equilibrium problems
- A hybrid method without extrapolation step for solving variational inequality problems
- The subgradient extragradient method for solving variational inequalities in Hilbert space
- Dual extrapolation and its applications to solving variational inequalities and related problems
- Convergence of the modified extragradient method for variational inequalities with non-Lipschitz operators
- Combined relaxation methods for variational inequalities
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- An extragradient algorithm for monotone variational inequalities
- Hybrid splitting methods for the system of operator inclusions with monotone operators
- An approximate method of ellipsoids
- Interior projection-like methods for monotone variational inequalities
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- A Randomized Mirror-Prox Method for Solving Structured Large-Scale Matrix Saddle-Point Problems
- Strongly Convergent Algorithms for Variational Inequality Problem Over the Set of Solutions the Equilibrium Problems
- Modification of the extra-gradient method for solving variational inequalities and certain optimization problems
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- Solving variational inequalities with Stochastic Mirror-Prox algorithm