Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
Publication: 2313241
DOI: 10.1134/S1064562419020042
zbMath: 1418.90192
OpenAlex: W2951523010
Wikidata: Q127677930
Scholia: Q127677930
MaRDI QID: Q2313241
S. V. Guminov, Alexander V. Gasnikov, Yu. E. Nesterov, Pavel Dvurechensky
Publication date: 18 July 2019
Published in: Doklady Mathematics
Full work available at URL: https://doi.org/10.1134/s1064562419020042
Related Items
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Stochastic saddle-point optimization for the Wasserstein barycenter problem
- Composite optimization for the resource allocation problem
- Numerical methods for the resource allocation problem in a computer network
- Nesterov's Method for Convex Optimization
- Radial duality. II: Applications and algorithms
- First-order methods for convex optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Alternating minimization methods for strongly convex optimization
- Generalized Momentum-Based Methods: A Hamiltonian Perspective
- Inexact model: a framework for optimization and variational inequalities
- Universal intermediate gradient method for convex problems with inexact oracle
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- Universal gradient methods for convex optimization problems
- An interior algorithm for nonlinear optimization that combines line search and trust region steps
- Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints
- A universal modification of the linear coupling method
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent