Optimal methods of smooth convex minimization
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 4079168
- One class of methods of unconditional minimization of a convex function, having a high rate of convergence
- scientific article; zbMATH DE number 3910150
- An optimal gradient method for smooth strongly convex minimization
- Minimization methods for smooth nonconvex functions
Cited in (40)
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
- A universal modification of the linear coupling method
- First-order methods of smooth convex optimization with inexact oracle
- Inexact model: a framework for optimization and variational inequalities
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- First-order methods for convex optimization
- Optimal Affine-Invariant Smooth Minimization Algorithms
- Universal intermediate gradient method for convex problems with inexact oracle
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Stochastic algorithms with geometric step decay converge linearly on sharp functions
- Convergence rates for deterministic and stochastic subgradient methods without Lipschitz continuity
- Sharpness, restart, and acceleration
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Universal gradient methods for convex optimization problems
- scientific article; zbMATH DE number 4079168
- On lower complexity bounds for large-scale smooth convex optimization
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- On optimal universal first-order methods for minimizing heterogeneous sums
- Optimal hyper-minimization
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- A simple nearly optimal restart scheme for speeding up first-order methods
- Complementary composite minimization, small gradients in general norms, and applications
- Smooth Optimization Methods for Minimax Problems
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Numerical methods for some classes of variational inequalities with relatively strongly monotone operators
- Fast gradient descent for convex minimization problems with an oracle producing a (δ, L)-model of function at the requested point
- Generalized momentum-based methods: a Hamiltonian perspective
- Scheduled restart momentum for accelerated stochastic gradient descent
- One class of methods of unconditional minimization of a convex function, having a high rate of convergence
- scientific article; zbMATH DE number 4164577
- Perseus: a simple and optimal high-order method for variational inequalities
- Fast gradient methods for uniformly convex and weakly smooth problems
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- scientific article; zbMATH DE number 3961354
- Unified acceleration of high-order algorithms under general Hölder continuity
- Lower bounds for parallel and randomized convex optimization
- Regularized nonlinear acceleration
- scientific article; zbMATH DE number 4121312
- Conditional gradient type methods for composite nonlinear and stochastic optimization
This page was built for publication: Optimal methods of smooth convex minimization
MaRDI item Q3720307