An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
From MaRDI portal
Publication:2502232
DOI: 10.1007/s11075-006-9023-9 · zbMath: 1101.65058 · OpenAlex: W1989566487 · MaRDI QID: Q2502232
Publication date: 12 September 2006
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-006-9023-9
Keywords: algorithm · unconstrained optimization · numerical examples · convergence acceleration · convex programming · backtracking · gradient descent methods
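The keywords above name the paper's setting: gradient descent with a backtracking line search for unconstrained optimization. As a point of reference, the classical (unaccelerated) method the paper builds on can be sketched as follows; this is a generic textbook sketch, not the paper's algorithm, and the parameter names (`alpha0`, `rho`, `c`) are conventional Armijo line-search choices assumed for illustration.

```python
import numpy as np

def backtracking_gradient_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                                  tol=1e-8, max_iter=1000):
    """Gradient descent with an Armijo backtracking line search.

    A minimal sketch of the baseline method; the paper's contribution
    is an acceleration scheme on top of this, not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Shrink the step until the Armijo sufficient-decrease condition
        # holds: f(x - alpha*g) <= f(x) - c*alpha*||g||^2
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= rho
        x = x - alpha * g
    return x

# Usage: minimize the convex quadratic f(x) = ||x - 1||^2 from x0 = 0.
f = lambda x: np.sum((x - 1.0) ** 2)
grad = lambda x: 2.0 * (x - 1.0)
x_star = backtracking_gradient_descent(f, grad, np.zeros(3))
```

On this quadratic the full step `alpha0 = 1.0` overshoots, the Armijo test rejects it, and one halving yields the exact minimizer.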
Related Items (36)
Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization ⋮ A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization ⋮ Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization ⋮ Two accelerated nonmonotone adaptive trust region line search methods ⋮ Hybridization rule applied on accelerated double step size optimization scheme ⋮ Unnamed Item ⋮ A transformation of accelerated double step size method for unconstrained optimization ⋮ A subspace conjugate gradient algorithm for large-scale unconstrained optimization ⋮ An accelerated descent CG algorithm with clustering the eigenvalues for large-scale nonconvex unconstrained optimization and its application in image restoration problems ⋮ An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method ⋮ Diagonal approximation of the Hessian by finite differences for unconstrained optimization ⋮ A class of spectral three-term descent Hestenes-Stiefel conjugate gradient algorithms for large-scale unconstrained optimization and image restoration problems ⋮ A regularized model for wetting/dewetting problems: positivity and asymptotic analysis ⋮ A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems ⋮ An accelerated double step size model in unconstrained optimization ⋮ Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery ⋮ An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization ⋮ Accelerated double direction method for solving unconstrained optimization problems ⋮ A survey of gradient methods for solving nonlinear optimization ⋮ A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization ⋮ New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods ⋮ A novel value for the parameter in the Dai-Liao-type conjugate gradient method ⋮ A note on a multiplicative parameters gradient method ⋮ A class of accelerated subspace minimization conjugate gradient methods ⋮ A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization ⋮ Hybridization of accelerated gradient descent method ⋮ Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization ⋮ On three-term conjugate gradient algorithms for unconstrained optimization ⋮ An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization ⋮ Accelerated gradient descent methods with line search ⋮ Computer Algebra and Line Search ⋮ Acceleration of conjugate gradient algorithms for unconstrained optimization ⋮ A new steepest descent method with global convergence properties ⋮ A modified Polak–Ribière–Polyak conjugate gradient algorithm for unconstrained optimization ⋮ MULTIPLE USE OF BACKTRACKING LINE SEARCH IN UNCONSTRAINED OPTIMIZATION ⋮ Accelerated multiple step-size methods for solving unconstrained optimization problems
Uses Software
Cites Work
- Convergence of line search methods for unconstrained optimization
- Efficient line search algorithm for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- CUTE
- Convergence Conditions for Ascent Methods
- On Steepest Descent
- Benchmarking optimization software with performance profiles
This page was built for publication: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization