Multi-step nonlinear conjugate gradient methods for unconstrained minimization
From MaRDI portal
Publication: Q953210
DOI: 10.1007/s10589-007-9087-z
zbMATH Open: 1181.90221
OpenAlex: W2022345660
MaRDI QID: Q953210
Authors: J. A. Ford, Yasushi Narushima, Hiroshi Yabe
Publication date: 17 November 2008
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-007-9087-z
Recommendations
- scientific article; zbMATH DE number 2196505
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Two new conjugate gradient methods based on modified secant equations
- A new nonlinear conjugate gradient method for unconstrained optimization
Keywords: global convergence; unconstrained optimization; conjugate gradient method; line search; multi-step secant condition
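The keywords above describe the general framework the paper works in: nonlinear conjugate gradient iterations combined with a line search, where the update parameter is derived from a (multi-step) secant condition. The following is a minimal sketch of a generic nonlinear conjugate gradient method (a textbook Polak-Ribière+ variant with Armijo backtracking on the Rosenbrock test function); it illustrates the iteration structure only and is not the paper's multi-step secant-condition method.

```python
import numpy as np

def rosenbrock(x):
    """Classic 2-D Rosenbrock test function, minimum at (1, 1)."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=20000):
    """Generic nonlinear CG sketch: Polak-Ribiere+ beta, Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ parameter, truncated at zero (a standard safeguard)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:          # restart if d fails to be a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

x_star = nonlinear_cg(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

The methods surveyed and developed in the paper replace the classical choice of the parameter beta with choices derived from conjugacy/secant conditions (here, multi-step secant conditions), while keeping this overall iterate-direction-line-search structure.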
Cites Work
- Testing Unconstrained Optimization Software
- Numerical Optimization
- Function minimization by conjugate gradients
- Line search algorithms with guaranteed sufficient decrease
- Title not available
- Technical Note—A Modified Conjugate Gradient Algorithm
- Title not available
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Title not available
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- New quasi-Newton equation and related methods for unconstrained optimization
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A truncated Newton method with non-monotone line search for unconstrained optimization
- Conjugate Gradient Methods with Inexact Searches
- Multi-step quasi-Newton methods for optimization
Cited In (33)
- Convergence of multi-step curve search method for unconstrained optimization
- A descent Dai-Liao conjugate gradient method based on a modified secant equation and its global convergence
- Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations
- On a scaled symmetric Dai-Liao-type scheme for constrained system of nonlinear equations with applications
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- Globally convergent modified Perry's conjugate gradient method
- An adaptive three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Conjugate gradient methods using value of objective function for unconstrained optimization
- A modified Perry conjugate gradient method and its global convergence
- Multiple search direction conjugate gradient method I: methods and their propositions
- A descent family of Dai-Liao conjugate gradient methods
- A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family
- Nonlinear conjugate gradient methods for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- A modified two-point stepsize gradient algorithm for unconstrained minimization
- A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter
- Enhanced Dai-Liao conjugate gradient methods for systems of monotone nonlinear equations
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
- Descent Perry conjugate gradient methods for systems of monotone nonlinear equations
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A smoothing conjugate gradient method for solving systems of nonsmooth equations
- Some modified Yabe-Takano conjugate gradient methods with sufficient descent condition
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A limited memory descent Perry conjugate gradient method
- Two-step conjugate gradient method for unconstrained optimization
- A new conjugate gradient algorithm for training neural networks based on a modified secant equation
This page was built for publication: Multi-step nonlinear conjugate gradient methods for unconstrained minimization