A survey of gradient methods for solving nonlinear optimization
Publication: 2220680
DOI: 10.3934/era.2020115
zbMath: 1458.65067
OpenAlex: W3102744735
MaRDI QID: Q2220680
Branislav Ivanov, Hai Feng Ma, Dijana Mosić, Predrag S. Stanimirović
Publication date: 25 January 2021
Published in: Electronic Research Archive
Full work available at URL: https://doi.org/10.3934/era.2020115
Keywords: unconstrained optimization; global convergence; nonlinear programming; gradient methods; conjugate gradient methods; line search method
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Methods of reduced gradient type (90C52)
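As context for the keywords above, here is a minimal sketch of the survey's basic object of study: gradient descent with an Armijo backtracking line search. This is an illustrative simplification, not code from the paper; the test function, starting point, and parameter values (alpha0, rho, c) are arbitrary choices for the example.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                     tol=1e-8, max_iter=100000):
    """Steepest descent with an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop at an approximate stationary point
            break
        d = -g                        # steepest-descent direction
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds:
        #   f(x + alpha*d) <= f(x) + c * alpha * grad(x)^T d
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
    return x

# Usage: the Rosenbrock function, whose unique minimizer is (1, 1).
f = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2.0 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
    200.0 * (x[1] - x[0]**2),
])
print(gradient_descent(f, grad, [-1.2, 1.0]))
```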
Related Items
- Solving nonlinear equations with a direct Broyden method and its acceleration
- Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring
- Adaptive neural network surrogate model for solving the implied volatility of time-dependent American option via Bayesian inference
- Theory of functional connections applied to quadratic and nonlinear programming under equality constraints
Uses Software
Cites Work
- Function minimization by conjugate gradients
- A Two-Term PRP-Based Descent Method
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Convergence Conditions for Ascent Methods
- On Steepest Descent
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Mean Value Methods in Iteration
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
- An efficient hybrid conjugate gradient method for unconstrained optimization
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- Accelerated gradient descent methods with line search
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property
- Two modifications of the method of the multiplicative parameters in descent gradient methods
- A continuous-time approach to online optimization
- A modified Fletcher-Reeves-type derivative-free method for symmetric nonlinear equations
- A class of globally convergent conjugate gradient methods
- Scalar correction method for solving large scale unconstrained minimization problems
- Computing \(\{2,4\}\) and \(\{2,3\}\)-inverses by using the Sherman-Morrison formula
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- Convergence analysis of a modified BFGS method on convex minimizations
- A new type of quasi-Newton updating formulas based on the new quasi-Newton equation
- Survey of derivative-free optimization
- New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems
- The convergence properties of some new conjugate gradient methods
- An accelerated double step size model in unconstrained optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A note about WYL's conjugate gradient method and its applications
- An improved Wei-Yao-Liu nonlinear conjugate gradient method for optimization computation
- A truncated Newton method with non-monotone line search for unconstrained optimization
- Efficient hybrid conjugate gradient techniques
- Stepsize analysis for descent methods
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- Global convergence result for conjugate gradient methods
- Non-monotone trust-region algorithms for nonlinear optimization subject to convex constraints
- A classification of quasi-Newton methods
- Modified two-point stepsize gradient methods for unconstrained optimization
- A transformation of accelerated double step size method for unconstrained optimization
- Two new Dai-Liao-type conjugate gradient methods for unconstrained optimization problems
- On a two-phase approximate greatest descent method for nonlinear optimization with equality constraints
- Approximate greatest descent in neural network optimization
- Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- A risk minimization problem for finite horizon semi-Markov decision processes with loss rates
- Accelerated double direction method for solving unconstrained optimization problems
- Hybrid modification of accelerated double direction method
- New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods
- A projected preconditioned conjugate gradient method for the linear response eigenvalue problem
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- Analysis of monotone gradient methods
- A conjugate gradient method with global convergence for large-scale unconstrained optimization problems
- Nonmonotone globalization techniques for the Barzilai-Borwein gradient method
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Convergence of line search methods for unconstrained optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- Hybridization of accelerated gradient descent method
- A projection method for convex constrained monotone nonlinear equations with applications
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration
- Nonlinear conjugate gradient methods for unconstrained optimization
- A note on hybridization process applied on transformed double step size model
- A global convergence of LS-CD hybrid conjugate gradient method
- A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing
- A brief survey of methods for solving nonlinear least-squares problems
- Semi-local convergence of the Newton-HSS method under the center Lipschitz condition
- A new reprojection of the conjugate directions
- Identification of Hessian matrix in distributed gradient-based multi-agent coordination control systems
- A class of nonmonotone stabilization methods in unconstrained optimization
- An efficient Dai-Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation
- A sufficient descent nonlinear conjugate gradient method for solving \(\mathcal{M} \)-tensor equations
- A conjugate gradient algorithm and its applications in image restoration
- A hybrid conjugate gradient method with descent property for unconstrained optimization
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- A Picard-Mann hybrid iterative process
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Minimization of functions having Lipschitz continuous first partial derivatives
- R-linear convergence of the Barzilai and Borwein gradient method
- A sufficient descent conjugate gradient method and its global convergence
- A Modified PRP Conjugate Gradient Algorithm with Trust Region for Optimization Problems
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
- Algorithm 851
- Applications of the Conjugate Gradient method in optimal surface parameterizations
- Modified limited memory BFGS method with nonmonotone line search for unconstrained optimization
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Two-Point Step Size Gradient Methods
- Technical Note—A Modified Conjugate Gradient Algorithm
- The application of quasi-Newton methods in fluid mechanics
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Algorithms for nonlinear constraints that use lagrangian functions
- Numerical Optimization
- CUTE
- Alternate minimization gradient method
- Computation of {2,4} and {2,3}-inverses based on rank-one updates
- Initial improvement of the hybrid accelerated gradient descent process
- A Nonmonotone Line Search Technique for Newton’s Method
- Fixed Points by a New Iteration Method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Introduction to Unconstrained Optimization with R
- On the Barzilai and Borwein choice of steplength for the gradient method
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
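A large share of the works cited above study nonlinear conjugate gradient methods, which replace the steepest-descent direction by d_{k+1} = -g_{k+1} + beta_k d_k; the Fletcher-Reeves choice is beta_k = ||g_{k+1}||^2 / ||g_k||^2. The sketch below is an illustrative simplification, not code from the survey: it uses Armijo backtracking with a restart safeguard instead of the Wolfe line searches typically assumed in the cited convergence theory, and the quadratic test problem is an arbitrary choice.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                    tol=1e-8, max_iter=10000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves update,
    a backtracking line search, and a steepest-descent restart safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:               # restart if d fails to be a descent direction
            d = -g
        alpha = alpha0
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho             # Armijo backtracking
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # conjugate search direction
        g = g_new
    return x

# Usage: a strictly convex quadratic; the minimizer is A^{-1} b = (0.2, 0.4).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(fletcher_reeves(f, grad, np.zeros(2)))
```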