On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
From MaRDI portal
Publication: Q315517
DOI: 10.1007/S11590-015-0936-X
zbMATH Open: 1353.90151
OpenAlex: W2272094913
MaRDI QID: Q315517
FDO: Q315517
Authors: Maryam Yashtini
Publication date: 21 September 2016
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-015-0936-x
Recommendations
- Faster subgradient methods for functions with Hölderian growth
- Convergence rates for deterministic and stochastic subgradient methods without Lipschitz continuity
- The method of gradient descent for minimizing non-convex functions
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Zeroth-order methods for noisy Hölder-gradient functions
Keywords: nonlinear programming; global convergence; convergence rate; gradient descent method; upper complexity bound; Hölder continuous gradient
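The keywords above situate the paper: gradient descent for objectives whose gradient is Hölder continuous, i.e. ‖∇f(x) − ∇f(y)‖ ≤ L‖x − y‖^ν for some ν ∈ (0, 1] (ν = 1 recovers Lipschitz smoothness). As a minimal illustrative sketch only, not the paper's algorithm: the fixed step size below uses the standard "effective Lipschitz constant" device for Hölder-smooth functions (as in Nesterov's universal gradient methods), and all function and parameter names here are our own.

```python
import numpy as np

def gd_holder(grad, x0, nu=0.5, L=1.0, delta=1e-6, iters=500):
    """Fixed-step gradient descent for an objective whose gradient is
    Hölder continuous with exponent nu and constant L (illustrative sketch).

    For nu < 1 there is no global Lipschitz constant, so we use the
    accuracy-dependent surrogate L_delta: the function behaves like an
    L_delta-smooth function up to an additive error of order delta.
    """
    if nu < 1.0:
        L_delta = (L ** (2.0 / (1.0 + nu))
                   * ((1.0 - nu) / ((1.0 + nu) * delta)) ** ((1.0 - nu) / (1.0 + nu)))
    else:
        L_delta = L  # nu = 1: ordinary Lipschitz-smooth case
    step = 1.0 / L_delta
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Toy example: f(x) = |x|^{1+nu}/(1+nu), whose gradient
# sign(x)*|x|^nu is Hölder continuous with exponent nu and L = 1.
nu = 0.5
g = lambda x: np.sign(x) * np.abs(x) ** nu
x_star = gd_holder(g, x0=np.array([1.0]), nu=nu, L=1.0)
```

With these toy settings the iterates settle into a small neighborhood of the minimizer x = 0; the size of that neighborhood is governed by delta, which is the price paid for treating a merely Hölder-smooth function as if it were Lipschitz-smooth.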
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Title not available
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Title not available
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- First-order methods of smooth convex optimization with inexact oracle
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An optimal method for stochastic composite optimization
- On Steepest Descent
- An efficient gradient method using the Yuan steplength
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Universal gradient methods for convex optimization problems
- A gradient-based continuous method for large-scale optimization problems
- Bregman operator splitting with variable stepsize for total variation image reconstruction
- An alternating direction approximate Newton algorithm for ill-conditioned inverse problems with application to parallel MRI
- Optimal methods of smooth convex minimization
- Title not available
- A nonmonotone spectral projected gradient method for large-scale topology optimization problems
- Optimal primal-dual methods for a class of saddle point problems
- On the Barzilai and Borwein choice of steplength for the gradient method
- The Limited Memory Conjugate Gradient Method
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
Cited In (9)
- Cyclic coordinate descent in the Hölder smooth setting
- The backtrack Hölder gradient method with application to min-max and min-min problems
- Gradient descent in the absence of global Lipschitz continuity of the gradients
- Sobolev gradient preconditioning for elliptic reaction-diffusion problems with some nonsmooth nonlinearities
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Generalized uniformly optimal methods for nonlinear programming
- Universal nonmonotone line search method for nonconvex multiobjective optimization problems with convex constraints
- On the quality of first-order approximation of functions with Hölder continuous gradient
- On the complexity of solving feasibility problems with regularized models