Pages that link to "Item:Q3720307"
The following pages link to Optimal methods of smooth convex minimization (Q3720307):
Displaying 30 items.
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517)
- First-order methods of smooth convex optimization with inexact oracle (Q403634)
- On lower complexity bounds for large-scale smooth convex optimization (Q478994)
- Universal gradient methods for convex optimization problems (Q494332)
- Conditional gradient type methods for composite nonlinear and stochastic optimization (Q1717236)
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions (Q2020604)
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach (Q2031939)
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds (Q2116603)
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point (Q2278192)
- Regularized nonlinear acceleration (Q2288185)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123)
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization (Q2515032)
- Fast gradient methods for uniformly convex and weakly smooth problems (Q2673504)
- Numerical methods for some classes of variational inequalities with relatively strongly monotone operators (Q2680746)
- A simple nearly optimal restart scheme for speeding up first-order methods (Q2696573)
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds (Q2696991)
- A universal modification of the linear coupling method (Q4631767)
- Lower Bounds for Parallel and Randomized Convex Optimization (Q4969036)
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity (Q5003214)
- Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent (Q5094616)
- Inexact proximal stochastic second-order methods for nonconvex composite optimization (Q5135256)
- Sharpness, Restart, and Acceleration (Q5210521)
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity (Q5231668)
- Optimal Affine-Invariant Smooth Minimization Algorithms (Q5376450)
- Generalized Momentum-Based Methods: A Hamiltonian Perspective (Q5857293)
- Inexact model: a framework for optimization and variational inequalities (Q5865338)
- Universal intermediate gradient method for convex problems with inexact oracle (Q5865342)
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum (Q6101532)
- First-order methods for convex optimization (Q6169988)
- On optimal universal first-order methods for minimizing heterogeneous sums (Q6191975)