Accelerated gradient descent methods with line search
DOI: 10.1007/s11075-009-9350-8 · zbMath: 1198.65104 · OpenAlex: W2014424370 · MaRDI QID: Q5961879
Marko B. Miladinović, Predrag S. Stanimirović
Publication date: 16 September 2010
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-009-9350-8
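For orientation, here is a minimal sketch of gradient descent with a backtracking (Armijo) line search, the general class of method the indexed paper studies. The step-size rule below is the generic Armijo backtracking rule, not the accelerated scheme of Miladinović and Stanimirović; the function names and parameter values are illustrative assumptions only.

```python
# Minimal sketch: steepest descent with backtracking (Armijo) line search.
# This is the generic textbook rule, not the accelerated scheme of the indexed paper.
import numpy as np

def gradient_descent_backtracking(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                                  tol=1e-6, max_iter=1000):
    """Minimize f from x0 by steepest descent with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stationarity test
            break
        d = -g                           # steepest-descent direction
        alpha = alpha0
        # Backtrack until the sufficient-decrease (Armijo) condition holds:
        # f(x + alpha*d) <= f(x) + c*alpha*g^T d
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d
    return x

# Example: a small convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
print(gradient_descent_backtracking(f, grad, np.zeros(2)))
```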
Related Items
- Hybridization rule applied on accelerated double step size optimization scheme
- A transformation of accelerated double step size method for unconstrained optimization
- Signal recovery with convex constrained nonlinear monotone equations through conjugate gradient hybrid approach
- A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications
- Heterogeneous Mediation Analysis on Epigenomic PTSD and Traumatic Stress in a Predominantly African American Cohort
- Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring
- An accelerated double step size model in unconstrained optimization
- A note on hybridization process applied on transformed double step size model
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Accelerated double direction method for solving unconstrained optimization problems
- Hybrid modification of accelerated double direction method
- A survey of gradient methods for solving nonlinear optimization
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- A novel value for the parameter in the Dai-Liao-type conjugate gradient method
- A note on a multiplicative parameters gradient method
- Hybridization of accelerated gradient descent method
- Scalar correction method for finding least-squares solutions on Hilbert spaces and its applications
- INITIAL IMPROVEMENT OF THE HYBRID ACCELERATED GRADIENT DESCENT PROCESS
- Theory of functional connections applied to quadratic and nonlinear programming under equality constraints
- Modified matrix-free methods for solving system of nonlinear equations
- Accelerated multiple step-size methods for solving unconstrained optimization problems
Cites Work
- Modified two-point stepsize gradient methods for unconstrained optimization
- Analysis of monotone gradient methods
- Convergence of line search methods for unconstrained optimization
- Efficient line search algorithm for unconstrained optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- On the asymptotic behaviour of some new gradient methods
- Optimization theory and methods. Nonlinear programming
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Preconditioned Barzilai-Borwein method for the numerical solution of partial differential equations
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Gradient Method with Retards and Generalizations
- Alternate minimization gradient method
- Alternate step gradient method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Function minimization by conjugate gradients
- Convergence Conditions for Ascent Methods
- Convex Analysis
- On Steepest Descent
- The conjugate gradient method in extremal problems
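Several of the cited works above concern the Barzilai-Borwein two-point step size and its convergence. As a hedged illustration only, the first BB formula is sketched below; safeguards and the modifications studied in the individual papers are omitted, and the helper name is hypothetical.

```python
# Minimal sketch of the first Barzilai-Borwein (BB1) step size,
# alpha_k = s^T s / (s^T y) with s = x_k - x_{k-1}, y = g_k - g_{k-1}.
import numpy as np

def bb_step(x_prev, x_curr, g_prev, g_curr, fallback=1.0):
    """Return the BB1 step size, with a simple positive fallback if s^T y <= 0."""
    s = x_curr - x_prev       # difference of iterates
    y = g_curr - g_prev       # difference of gradients
    sy = s.dot(y)
    return s.dot(s) / sy if sy > 0 else fallback
```

The second BB formula uses s^T y / (y^T y) instead; which variant and which safeguards are adopted varies across the cited works.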