The incremental Gauss-Newton algorithm with adaptive stepsize rule
DOI: 10.1023/A:1025703629626
zbMATH Open: 1081.90050
OpenAlex: W1562505150
MaRDI QID: Q1415481
FDO: Q1415481
Authors: Hiroyuki Moriyama, Nobuo Yamashita, Masao Fukushima
Publication date: 4 December 2003
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1023/a:1025703629626
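The paper catalogued here studies an incremental Gauss-Newton method with an adaptive stepsize rule for nonlinear least squares. As a rough illustration of the *incremental* idea only — taking one damped Gauss-Newton step per residual term rather than assembling the full problem — the following sketch may help; the diminishing stepsize and the damping constant are placeholder assumptions, not the paper's actual adaptive rule.

```python
import numpy as np

def incremental_gauss_newton(residual, jacobian, x0, data, passes=20, step0=1.0):
    """Illustrative incremental Gauss-Newton loop (not the paper's method).

    residual(x, d) -> residual vector r_i for data item d
    jacobian(x, d) -> Jacobian matrix J_i of r_i at x
    Processes one data item at a time, taking a damped Gauss-Newton step
    on each component function with a simple diminishing stepsize.
    """
    x = np.asarray(x0, dtype=float)
    k = 0
    for _ in range(passes):
        for d in data:
            r = residual(x, d)
            J = jacobian(x, d)
            g = J.T @ r
            # Small Levenberg-style damping keeps J^T J + mu*I invertible
            # even when a single J_i is rank-deficient.
            H = J.T @ J + 1e-8 * np.eye(x.size)
            step = step0 / (1.0 + 0.01 * k)  # placeholder diminishing stepsize
            x = x - step * np.linalg.solve(H, g)
            k += 1
    return x
```

For a single scalar residual this reduces to a Kaczmarz-like projection step, which is why incremental least-squares schemes are closely related to the extended Kalman filter listed among the recommendations below.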
Recommendations
- Incremental Least Squares Methods and the Extended Kalman Filter
- A recursive algorithm for nonlinear least-squares problems
- Optimal recursive iterative algorithm for discrete nonlinear least-squares estimation
- The Iterated Kalman Smoother as a Gauss–Newton Method
- Convergence analysis of the general Gauss-Newton algorithm
MSC classifications:
- Numerical optimization and variational techniques (65K10)
- Convex programming (90C25)
- Numerical solutions to overdetermined systems, pseudoinverses (65F20)
Cited In (9)
- Convergence acceleration of ensemble Kalman inversion in nonlinear settings
- Incremental subgradient algorithms with dynamic step sizes for separable convex optimizations
- Incremental Least Squares Methods and the Extended Kalman Filter
- A globally convergent incremental Newton method
- MASAGE: model-agnostic sequential and adaptive game estimation
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- A recursive algorithm for nonlinear least-squares problems
- Convergence rate of incremental gradient and incremental Newton methods
- Estimation of Lévy processes via stochastic programming and Kalman filtering