Accelerated gradient descent methods with line search
From MaRDI portal
Publication:5961879
Recommendations
- Multiplicative parameters in gradient descent methods
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- scientific article; zbMATH DE number 1960973
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Accelerated multiple step-size methods for solving unconstrained optimization problems
Cites work
- scientific article; zbMATH DE number 3529352
- scientific article; zbMATH DE number 3381785
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Alternate minimization gradient method
- Alternate step gradient method
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- An unconstrained optimization test functions collection
- Analysis of monotone gradient methods
- Convergence Conditions for Ascent Methods
- Convergence of line search methods for unconstrained optimization
- Convex Analysis
- Efficient line search algorithm for unconstrained optimization
- Function minimization by conjugate gradients
- Gradient Method with Retards and Generalizations
- Minimization of functions having Lipschitz continuous first partial derivatives
- Modified two-point stepsize gradient methods for unconstrained optimization
- On Steepest Descent
- On the Barzilai and Borwein choice of steplength for the gradient method
- On the asymptotic behaviour of some new gradient methods
- Optimization theory and methods. Nonlinear programming
- Preconditioned Barzilai-Borwein method for the numerical solution of partial differential equations
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The conjugate gradient method in extremal problems
- Two-Point Step Size Gradient Methods
- R-linear convergence of the Barzilai and Borwein gradient method
Cited in (29)
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- A note on a multiplicative parameters gradient method
- Signal recovery with convex constrained nonlinear monotone equations through conjugate gradient hybrid approach
- A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications
- Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring
- Initial improvement of the hybrid accelerated gradient descent process
- Accelerated double direction method for solving unconstrained optimization problems
- An accelerated double step size model in unconstrained optimization
- Accelerated multiple step-size methods for solving unconstrained optimization problems
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- An improved derivative-free method via double direction approach for solving systems of nonlinear equations
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- Accelerated line-search and trust-region methods
- Modified matrix-free methods for solving system of nonlinear equations
- Hybrid modification of accelerated double direction method
- Heterogeneous Mediation Analysis on Epigenomic PTSD and Traumatic Stress in a Predominantly African American Cohort
- Hybridization rule applied on accelerated double step size optimization scheme
- Hybridization of accelerated gradient descent method
- Theory of functional connections applied to quadratic and nonlinear programming under equality constraints
- Multiplicative parameters in gradient descent methods
- An accelerated minimal gradient method with momentum for strictly convex quadratic optimization
- Finding approximate local minima faster than gradient descent
- A note on hybridization process applied on transformed double step size model
- Scalar correction method for finding least-squares solutions on Hilbert spaces and its applications
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- A transformation of accelerated double step size method for unconstrained optimization
- A survey of gradient methods for solving nonlinear optimization
This page was built for publication: Accelerated gradient descent methods with line search