Optimization of unconstrained functions with sparse Hessian matrices—Quasi-Newton methods
Publication: 3659547
DOI: 10.1007/BF02591769
zbMath: 0513.90072
MaRDI QID: Q3659547
Publication date: 1983
Published in: Mathematical Programming
Cholesky factorization; computational experience; positive-definiteness; dense unconstrained optimization; known sparsity pattern; sparse quasi-Newton methods; twice-continuously differentiable function
65K05: Numerical mathematical programming methods
90C30: Nonlinear programming
49M15: Newton-type methods
49M37: Numerical methods based on nonlinear programming
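The keywords above name the main ingredients of sparse quasi-Newton methods: a symmetric secant update restricted to a known sparsity pattern, with a Cholesky factorization used to monitor positive-definiteness. The following is only an illustrative sketch of those ingredients, not the paper's actual algorithm; the function names are invented here, and the naive projection step shown does not satisfy the secant equation exactly (Toint-style sparse updates instead solve for the update within the pattern).

```python
import numpy as np

def projected_psb_update(B, s, y, pattern):
    """One symmetric PSB-style secant update, then projection onto a
    known (symmetric, boolean) sparsity pattern.

    Illustrative simplification: after projection the secant equation
    B_new @ s == y generally no longer holds exactly.
    """
    r = y - B @ s                      # secant residual
    ss = s @ s
    # symmetric rank-two PSB update
    dB = (np.outer(r, s) + np.outer(s, r)) / ss \
         - ((r @ s) / ss**2) * np.outer(s, s)
    B_new = B + dB
    B_new = np.where(pattern, B_new, 0.0)   # zero entries off the pattern
    return 0.5 * (B_new + B_new.T)          # re-symmetrize

def is_positive_definite(B):
    """Cholesky succeeds iff B is (numerically) positive definite."""
    try:
        np.linalg.cholesky(B)
        return True
    except np.linalg.LinAlgError:
        return False
```

For a tridiagonal sparsity pattern, for instance, `pattern` would be `abs(i - j) <= 1`; the Cholesky test mirrors the role of factorization in maintaining a positive-definite sparse Hessian approximation.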
Related Items
Optimization of large-scale complex systems, A sparse sequential quadratic programming algorithm, Optimization of unconstrained functions with sparse Hessian matrices—Newton-type methods
Cites Work
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- The computation of Lagrange-multiplier estimates for constrained minimization
- On Variable-Metric Methods for Sparse Hessians
- On the Superlinear Convergence of an Algorithm for Solving a Sparse Minimization Problem
- On the Estimation of Sparse Hessian Matrices
- A note about sparsity exploiting quasi-Newton updates
- Inexact Newton Methods
- Newton’s Method with a Model Trust Region Modification
- Quasi-Newton Methods, Motivation and Theory
- On Sparse and Symmetric Matrix Updating Subject to a Linear Equation
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Methods for Modifying Matrix Factorizations
- A Rapidly Convergent Descent Method for Minimization
- Modification of a Quasi-Newton Method for Nonlinear Equations with a Sparse Jacobian
- A method for the solution of certain non-linear problems in least squares