Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
From MaRDI portal
Publication:2322819
DOI: 10.1007/s11075-018-0591-2 · zbMath: 1421.90166 · OpenAlex: W2895570191 · MaRDI QID: Q2322819
Publication date: 5 September 2019
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-018-0591-2
nonlinear least squares; global convergence; scaled nonlinear conjugate gradient; structured secant relation
Numerical mathematical programming methods (65K05) Methods of quasi-Newton type (90C53) Numerical methods based on nonlinear programming (49M37)
Related Items (9)
Chebyshev pseudospectral approximation of two-dimensional fractional Schrödinger equation on a convex and rectangular domain ⋮ On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems ⋮ An accelerated three-term extension of a descent nonlinear conjugate gradient method ⋮ A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem ⋮ Unnamed Item ⋮ Modified Newton integration algorithm with noise suppression for online dynamic nonlinear optimization ⋮ A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control ⋮ Structured spectral algorithm with a nonmonotone line search for nonlinear least squares ⋮ Structured diagonal Gauss-Newton method for nonlinear least squares
Uses Software
Cites Work
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- Convergence analysis of a modified BFGS method on convex minimizations
- A modified quasi-Newton method for structured optimization with partial information on the Hessian
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- A new structured quasi-Newton algorithm using partial information on Hessian
- Two new conjugate gradient methods based on modified secant equations
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- New quasi-Newton equation and related methods for unconstrained optimization
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Globally convergent variable metric method for convex nonsmooth unconstrained minimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- On the convergence of the modified Levenberg-Marquardt method with a nonmonotone second order Armijo type line search
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Convergence Theorems for Least-Change Secant Update Methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Algorithm 851
- Variational Methods for Non-Linear Least-Squares
- Two-Point Step Size Gradient Methods
- An Adaptive Nonlinear Least-Squares Algorithm
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Quasi-Newton Methods, Motivation and Theory
- Matrix conditioning and nonlinear optimization
- Conjugate Gradient Methods with Inexact Searches
- On the Use of Product Structure in Secant Methods for Nonlinear Least Squares Problems
- Local and superlinear convergence of structured quasi-Newton methods for nonlinear optimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Class of Methods for Solving Nonlinear Simultaneous Equations
- Function minimization by conjugate gradients
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The conjugate gradient method in extremal problems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A spectral conjugate gradient method for unconstrained optimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles
This page was built for publication: Scaled nonlinear conjugate gradient methods for nonlinear least squares problems