A Derivative-Free Nonlinear Least Squares Solver
DOI: 10.1007/978-3-031-22990-9_1
MaRDI QID: Q6090756
Publication date: 17 November 2023
Published in: Communications in Computer and Information Science
Keywords: derivative-free optimization, nonlinear least squares, preconditioned subspace descent, pseudorandom preconditioning
Cited Works
- Randomized preprocessing of homogeneous linear systems of equations
- Minimization of functions having Lipschitz continuous first partial derivatives
- Use of preconditioned Krylov subspaces in conjugate gradient-type methods for solving a nonlinear least squares problem
- Structure of the Hessian matrix and an economical implementation of Newton’s method in the problem of canonical approximation of tensors
- Hybrid Krylov Methods for Nonlinear Systems of Equations
- Testing Unconstrained Optimization Software
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Convergence Theory of Nonlinear Newton–Krylov Algorithms
- On a Class of Nonlinear Equation Solvers Based on the Residual Norm Reduction over a Sequence of Affine Subspaces
- A Local Convergence Theory for Combined Inexact-Newton/Finite-Difference Projection Methods
- An Adaptive Algebraic Multigrid Algorithm for Low-Rank Canonical Tensor Decomposition
- Minimization methods for approximating tensors and their comparison