Inexact model: a framework for optimization and variational inequalities
DOI: 10.1080/10556788.2021.1924714 · zbMath: 1489.65089 · arXiv: 1902.00990 · OpenAlex: W3178749228 · MaRDI QID: Q5865338
Pavel Dvurechensky, Dmitry Pasechnyuk, Alexander Tyurin, Mohammad S. Alkousa, Artem Agafonov, Sergei Yu. Artamonov, Victorya Piskunova, Fedor S. Stonyakin, Darina Dvinskikh, Alexander V. Gasnikov
Publication date: 13 June 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1902.00990
Keywords: convex optimization · variational inequality · acceleration · level-set method · proximal method · saddle-point problem · composite optimization · relative smoothness · universal method · mirror-prox
MSC: Numerical mathematical programming methods (65K05) · Large-scale problems in mathematical programming (90C06) · Numerical methods for variational inequalities and related problems (65K15)
Cites Work
- Primal-dual subgradient methods for convex problems
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Universal gradient methods for convex optimization problems
- Lectures on convex optimization
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Two-level iterative method for non-stationary mixed variational inequalities
- Universal method for stochastic composite optimization problems
- Complexity bounds for primal-dual methods minimizing the model of objective function
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Gradient methods for problems with inexact model of the objective
- Golden ratio algorithms for variational inequalities
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Implementable tensor methods in unconstrained convex optimization
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- An adaptive proximal method for variational inequalities
- Non-smooth non-convex Bregman minimization: unification and new algorithms
- Cubic regularization of Newton method and its global performance
- Some algorithms for solving mixed variational inequalities
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Stochastic intermediate gradient method for convex optimization problems
- A stable alternative to Sinkhorn's algorithm for regularized optimal transport
- Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Accuracy Certificates for Computational Problems with Convex Structure
- Optimal methods of smooth convex minimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Variance-Based Extragradient Methods with Line Search for Stochastic Variational Inequalities
- Proximal extrapolated gradient methods for variational inequalities
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Random Gradient Extrapolation for Distributed and Stochastic Optimization
- Gradient methods with memory
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Universal intermediate gradient method for convex problems with inexact oracle