A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization
From MaRDI portal
Publication:555476
DOI: 10.1016/j.amc.2011.05.032
zbMath: 1232.65096
OpenAlex: W2171032533
MaRDI QID: Q555476
Publication date: 22 July 2011
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2011.05.032
Keywords: global convergence; steepest descent; nonlinear conjugate gradient method; Liu-Storey method; Polak-Ribière-Polyak method; step modification
MSC classes: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26)
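The record above concerns globalization of nonlinear conjugate gradient methods, including the Polak-Ribière-Polyak (PRP) variant named in the keywords. As a point of orientation only, the following is a minimal, generic sketch of a PRP-type nonlinear CG iteration with a backtracking Armijo line search and the common PRP+ nonnegativity safeguard; it does not reproduce the globalization technique proposed in the paper itself, and the function names and parameters are illustrative assumptions.

```python
import numpy as np

def prp_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Generic PRP+ nonlinear conjugate gradient sketch (illustrative;
    not the globalization technique of the paper this record describes)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search
        t, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + t * d) > fx + c * t * g.dot(d) and t > 1e-12:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP formula: beta = g_new^T (g_new - g) / ||g||^2
        beta = g_new.dot(g_new - g) / g.dot(g)
        beta = max(beta, 0.0)  # PRP+ safeguard
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b (here x* = [0, 1]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = prp_cg(f, grad, np.array([0.0, 0.0]))
```

The steepest-descent restart and the PRP+ truncation are standard practical safeguards; the paper's contribution is a different, dedicated globalization mechanism.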
Related Items (14)
Notes on the complete elliptic integral of the first kind ⋮ Schur convexity and inequalities for a multivariate symmetric function ⋮ Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence ⋮ Dai-Kou type conjugate gradient methods with a line search only using gradient ⋮ A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs ⋮ A hybrid self-adjusted mean value method for reliability-based design optimization using sufficient descent condition ⋮ A hybrid conjugate finite-step length method for robust and efficient reliability analysis ⋮ Sharp bounds for Sándor-Yang means in terms of one-parameter family of bivariate means ⋮ Petrović-type inequalities for harmonic \(h\)-convex functions ⋮ Optimal two-parameter geometric and arithmetic mean bounds for the Sándor-Yang mean ⋮ A note on generalized convex functions ⋮ Some new fractional integral inequalities for exponentially \(m\)-convex functions via extended generalized Mittag-Leffler function ⋮ Hermite-Hadamard type inequalities for co-ordinated convex and qausi-convex functions and their applications ⋮ Monotonicity properties and bounds involving the two-parameter generalized Grötzsch ring function
Uses Software
Cites Work
- The convergence properties of some new conjugate gradient methods
- On the limited memory BFGS method for large scale optimization
- Efficient generalized conjugate gradient algorithms. I: Theory
- A modified PRP conjugate gradient method
- Optimization. Algorithms and consistent approximations
- A globally convergent version of the Polak-Ribière conjugate gradient method
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Scaled conjugate gradient algorithms for unconstrained optimization
- A three-parameter family of nonlinear conjugate gradient methods
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- CUTE
- Line search algorithms with guaranteed sufficient decrease
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- Some descent three-term conjugate gradient methods and their global convergence
- Convergence Conditions for Ascent Methods
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles