Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
From MaRDI portal
Publication: 964960
DOI: 10.1016/j.cam.2009.12.031
zbMath: 1223.65043
OpenAlex: W2084371857
MaRDI QID: Q964960
Yasushi Narushima, Hiroshi Yabe, Michiya Kobayashi
Publication date: 21 April 2010
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2009.12.031
Keywords: global convergence, conjugate gradient method, least squares problems, line search, structured secant condition
Related Items
- Application of a combination production function model
- On Hager and Zhang's conjugate gradient method with guaranteed descent
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- A new descent algorithm using the three-step discretization method for solving unconstrained optimization problems
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A structured diagonal Hessian approximation method with evaluation complexity analysis for nonlinear least squares
- Some nonlinear conjugate gradient methods based on spectral scaling secant equations
- A brief survey of methods for solving nonlinear least-squares problems
- A conjugate gradient method with sufficient descent property
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- Structured spectral algorithm with a nonmonotone line search for nonlinear least squares
Uses Software
Cites Work
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- Quadratic and superlinear convergence of the Huschens method for nonlinear least squares problems
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Comparison of some conjugate direction procedures for function minimization
- Convergence Properties of Algorithms for Nonlinear Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Search
- Variational Methods for Non-Linear Least-Squares
- Technical Note—A Modified Conjugate Gradient Algorithm
- Testing Unconstrained Optimization Software
- An Adaptive Nonlinear Least-Squares Algorithm
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Use of Product Structure in Secant Methods for Nonlinear Least Squares Problems
- Local and Superlinear Convergence of Structured Quasi-Newton Methods for Nonlinear Optimization
- Line search algorithms with guaranteed sufficient decrease
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations