Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
Publication: 5085262
DOI: 10.1080/10556788.2020.1731747
zbMath: 1489.90124
arXiv: 1809.05895
OpenAlex: W3009707357
MaRDI QID: Q5085262
Sergey Guminov, Pavel Dvurechensky, Yu. E. Nesterov, Alexander V. Gasnikov
Publication date: 27 June 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1809.05895
Related Items
- Network Utility Maximization by Updating Individual Transmission Rates
- Improved exploitation of higher order smoothness in derivative-free optimization
- Stochastic saddle-point optimization for the Wasserstein barycenter problem
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Accelerated methods for weakly-quasi-convex optimization problems
- Multistage transportation model and sufficient conditions for its potentiality
- Smooth monotone stochastic variational inequalities and saddle point problems: a survey
- Recent Theoretical Advances in Non-Convex Optimization
- Alternating minimization methods for strongly convex optimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Inexact model: a framework for optimization and variational inequalities
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
- Information-based complexity of linear operator equations
- Introductory lectures on convex optimization. A basic course.
- A fast dual proximal gradient algorithm for convex minimization and applications
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Efficient first-order methods for convex minimization: a constructive approach
- Generalized uniformly optimal methods for nonlinear programming
- A stable alternative to Sinkhorn's algorithm for regularized optimal transport
- Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- A universal modification of the linear coupling method
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- New limited memory bundle method for large-scale nonsmooth optimization