On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
Recommendations
- Faster subgradient methods for functions with Hölderian growth
- Convergence rates for deterministic and stochastic subgradient methods without Lipschitz continuity
- The method of gradient descent for minimizing non-convex functions
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Zeroth-order methods for noisy Hölder-gradient functions
Cites work
- scientific article (no title available); zbMATH DE number 4079168
- scientific article (no title available); zbMATH DE number 3790208
- scientific article (no title available); zbMATH DE number 5060482
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- A gradient-based continuous method for large-scale optimization problems
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A nonmonotone spectral projected gradient method for large-scale topology optimization problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An alternating direction approximate Newton algorithm for ill-conditioned inverse problems with application to parallel MRI
- An efficient gradient method using the Yuan steplength
- An optimal method for stochastic composite optimization
- Bregman operator splitting with variable stepsize for total variation image reconstruction
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- First-order methods of smooth convex optimization with inexact oracle
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Introductory lectures on convex optimization. A basic course.
- On Steepest Descent
- On the Barzilai and Borwein choice of steplength for the gradient method
- Optimal methods of smooth convex minimization
- Optimal primal-dual methods for a class of saddle point problems
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- Smooth minimization of non-smooth functions
- Some descent three-term conjugate gradient methods and their global convergence
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The Limited Memory Conjugate Gradient Method
- Two-Point Step Size Gradient Methods
- Universal gradient methods for convex optimization problems
Cited in (9 documents)
- Cyclic coordinate descent in the Hölder smooth setting
- The backtrack Hölder gradient method with application to min-max and min-min problems
- Gradient descent in the absence of global Lipschitz continuity of the gradients
- Sobolev gradient preconditioning for elliptic reaction-diffusion problems with some nonsmooth nonlinearities
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Generalized uniformly optimal methods for nonlinear programming
- Universal nonmonotone line search method for nonconvex multiobjective optimization problems with convex constraints
- On the quality of first-order approximation of functions with Hölder continuous gradient
- On the complexity of solving feasibility problems with regularized models
This page was built for publication: On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients