Solving variational inequalities with monotone operators on domains given by linear minimization oracles
DOI: 10.1007/s10107-015-0876-3 · zbMATH Open: 1333.65074 · arXiv: 1312.1073 · OpenAlex: W2119157404 · Wikidata: Q57392869 · Scholia: Q57392869 · MaRDI QID: Q263192 · FDO: Q263192
Anatoli Juditsky, Arkadi Nemirovski
Publication date: 4 April 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1312.1073
MSC classification:
- Convex programming (90C25)
- Pattern recognition, speech recognition (68T10)
- Minimax problems in mathematical programming (90C47)
- Numerical methods for variational inequalities and related problems (65K15)
Cites Work
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Convex Analysis
- Conditional gradient algorithms with open loop step size rules
- New variants of bundle methods
- Randomized first order algorithms with applications to \(\ell _{1}\)-minimization
- On a unified view of nullspace-type conditions for recoveries associated with general sparsity structures
- Dual variational inequalities
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Accuracy Certificates for Computational Problems with Convex Structure
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Minimax Theorems and Conjugate Saddle-Functions
- On first-order algorithms for \(\ell _{1}\)/nuclear norm minimization
- Variational inequalities and flow in porous media
Cited In (6)
- Variational Gram Functions: Convex Analysis and Optimization
- Decomposition techniques for bilinear saddle point problems and variational inequalities with affine monotone operators
- Nonsmooth projection-free optimization with functional constraints
- Analysis of two versions of relaxed inertial algorithms with Bregman divergences for solving variational inequalities
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Derivative-free alternating projection algorithms for general nonconvex-concave minimax problems