On the local convergence of a derivative-free algorithm for least-squares minimization
From MaRDI portal
Publication: 429479
DOI: 10.1007/s10589-010-9367-x
zbMath: 1268.90043
OpenAlex: W2160202583
MaRDI QID: Q429479
Andrew R. Conn, Hongchao Zhang
Publication date: 19 June 2012
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-010-9367-x
Keywords: system of nonlinear equations; least-squares; local convergence; derivative-free optimization; Levenberg-Marquardt method; trust region; asymptotic convergence
MSC classifications: Derivative-free methods and methods using generalized derivatives (90C56); Quadratic programming (90C20)
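The keywords above name the Levenberg-Marquardt method in a derivative-free setting. As a minimal illustrative sketch only (not the algorithm analyzed in the paper, which builds interpolation models inside a trust region), the following shows one damped Gauss-Newton step for a least-squares problem, with a forward-difference Jacobian so that no analytic derivatives are required; the function and parameter names are hypothetical.

```python
import numpy as np

def lm_step(residual, x, mu, h=1e-7):
    """One Levenberg-Marquardt step for min ||r(x)||^2.

    The Jacobian is approximated column by column with forward
    differences, so only residual evaluations are needed.
    """
    r = residual(x)
    n = x.size
    J = np.empty((r.size, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (residual(x + e) - r) / h  # forward-difference column
    # Damped normal equations: (J^T J + mu I) s = -J^T r
    s = np.linalg.solve(J.T @ J + mu * np.eye(n), -J.T @ r)
    return x + s

# Tiny demo: r(x) = [x0 + x1 - 3, x0 - x1 - 1], whose zero is (2, 1).
def r(x):
    return np.array([x[0] + x[1] - 3.0, x[0] - x[1] - 1.0])

x = np.array([0.0, 0.0])
for _ in range(20):
    x = lm_step(r, x, mu=1e-3)
```

A fixed damping parameter `mu` is used here purely for brevity; practical implementations adapt it (or an equivalent trust-region radius) at every iteration, which is precisely the mechanism whose local convergence the paper studies.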
Related Items
- A class of derivative-free trust-region methods with interior backtracking technique for nonlinear optimization problems subject to linear inequality constraints
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- Local analysis of a spectral correction for the Gauss-Newton model applied to quadratic residual problems
- A Stochastic Levenberg–Marquardt Method Using Random Models with Complexity Results
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- A derivative-free trust region algorithm with nonmonotone filter technique for bound constrained optimization
- An affine scaling derivative-free trust region method with interior backtracking technique for bounded-constrained nonlinear programming
- An inexact derivative-free Levenberg-Marquardt method for linear inequality constrained nonlinear systems under local error bound conditions
- Conjugate gradient path method without line search technique for derivative-free unconstrained optimization
- A brief survey of methods for solving nonlinear least-squares problems
- A derivative-free Gauss-Newton method
- Derivative-free optimization methods
- Combined gradient methods for multiobjective optimization
- Recent advances in trust region algorithms
- An affine-scaling derivative-free trust-region method for solving nonlinear systems subject to linear inequality constraints
- Accelerated derivative-free nonlinear least-squares applied to the estimation of Manning coefficients
- An interior affine scaling cubic regularization algorithm for derivative-free optimization subject to bound constraints
Uses Software
Cites Work
- Test examples for nonlinear programming codes
- On trust region methods for unconstrained minimization without derivatives
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Geometry of interpolation sets in derivative free optimization
- Self-adaptive inexact proximal point methods
- Convergence properties of a self-adaptive Levenberg-Marquardt algorithm under local error bound condition
- The Proximal Point Algorithm with Genuine Superlinear Convergence for the Monotone Complementarity Problem
- A Derivative-Free Algorithm for Least-Squares Minimization
- Asymptotic Convergence Analysis of a New Class of Proximal Point Methods
- Developments of NEWUOA for minimization without derivatives
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Introduction to Derivative-Free Optimization
- Trust Region Methods
- Convergence Properties of the Inexact Levenberg-Marquardt Method under Local Error Bound Conditions
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points