Optimal methods of smooth convex minimization
MaRDI QID: Q3720307
DOI: 10.1016/0041-5553(85)90100-4
zbMath: 0591.90072
OpenAlex: W1991760653
Authors: Arkadi Nemirovski, Yu. E. Nesterov
Publication date: 1985
Published in: USSR Computational Mathematics and Mathematical Physics
Full work available at URL: https://doi.org/10.1016/0041-5553(85)90100-4
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Programming in abstract spaces (90C48)
Related Items
Lower Bounds for Parallel and Randomized Convex Optimization
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent
Fast gradient methods for uniformly convex and weakly smooth problems
First-order methods of smooth convex optimization with inexact oracle
Numerical methods for some classes of variational inequalities with relatively strongly monotone operators
Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
First-order methods for convex optimization
On optimal universal first-order methods for minimizing heterogeneous sums
A simple nearly optimal restart scheme for speeding up first-order methods
General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
Optimal Affine-Invariant Smooth Minimization Algorithms
Conditional gradient type methods for composite nonlinear and stochastic optimization
Inexact proximal stochastic second-order methods for nonconvex composite optimization
On lower complexity bounds for large-scale smooth convex optimization
A universal modification of the linear coupling method
Universal gradient methods for convex optimization problems
Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
Regularized nonlinear acceleration
Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
Sharpness, Restart, and Acceleration
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
Generalized Momentum-Based Methods: A Hamiltonian Perspective
Inexact model: a framework for optimization and variational inequalities
Universal intermediate gradient method for convex problems with inexact oracle
Restarting Frank-Wolfe: faster rates under Hölderian error bounds