Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
From MaRDI portal
Recommendations
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- A Family of Scaled Factorized Broyden-Like Methods for Nonlinear Least Squares Problems
- scientific article; zbMATH DE number 5073
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- Structured two-point stepsize gradient methods for nonlinear least squares
Cites work
- scientific article; zbMATH DE number 5060482
- scientific article; zbMATH DE number 3278849
- scientific article; zbMATH DE number 2221955
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Class of Methods for Solving Nonlinear Simultaneous Equations
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A modified quasi-Newton method for structured optimization with partial information on the Hessian
- A new structured quasi-Newton algorithm using partial information on Hessian
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- A spectral conjugate gradient method for unconstrained optimization
- A survey of nonlinear conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Algorithm 851
- An Adaptive Nonlinear Least-Squares Algorithm
- Benchmarking optimization software with performance profiles
- Conjugate Gradient Methods with Inexact Searches
- Convergence Theorems for Least-Change Secant Update Methods
- Convergence analysis of a modified BFGS method on convex minimizations
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- Function minimization by conjugate gradients
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Globally convergent variable metric method for convex nonsmooth unconstrained minimization
- Local and superlinear convergence of structured quasi-Newton methods for nonlinear optimization
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- Matrix conditioning and nonlinear optimization
- Multi-step nonlinear conjugate gradient methods for unconstrained minimization
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- New conjugacy conditions and related nonlinear conjugate gradient methods
- New quasi-Newton equation and related methods for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- On the Use of Product Structure in Secant Methods for Nonlinear Least Squares Problems
- On the convergence of the modified Levenberg-Marquardt method with a nonmonotone second order Armijo type line search
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Quasi-Newton Methods, Motivation and Theory
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The conjugate gradient method in extremal problems
- Two new conjugate gradient methods based on modified secant equations
- Two-Point Step Size Gradient Methods
- Variational Methods for Non-Linear Least-Squares
Cited in (18)
- A new algorithm with structured diagonal Hessian approximation for solving nonlinear least squares problems and application to robotic motion control
- Nonlinear least squares and Sobolev gradients
- Chebyshev pseudospectral approximation of two dimensional fractional Schrödinger equation on a convex and rectangular domain
- Structured spectral algorithm with a nonmonotone line search for nonlinear least squares
- Structured diagonal Gauss-Newton method for nonlinear least squares
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- Scaled conjugate gradient algorithms for unconstrained optimization
- An Accelerated Three-Term Extension of a Descent Nonlinear Conjugate Gradient Method
- Modified Newton integration algorithm with noise suppression for online dynamic nonlinear optimization
- A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- A new structured spectral conjugate gradient method for nonlinear least squares problems
- Optimal scaling parameters for spectral conjugate gradient methods
- Using a spectral scaling structured BFGS method for constrained nonlinear least squares
- Structured two-point stepsize gradient methods for nonlinear least squares
- On diagonally structured scheme for nonlinear least squares and data-fitting problems
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- Nonstandard Scaling Matrices for Trust Region Gauss–Newton Methods
MaRDI item Q2322819