Optimization of unconstrained functions with sparse Hessian matrices—Quasi-Newton methods
Publication: 3659547
DOI: 10.1007/BF02591769 · zbMath: 0513.90072 · OpenAlex: W2089895988 · MaRDI QID: Q3659547
Publication date: 1983
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf02591769
Keywords: Cholesky factorization; computational experience; positive-definiteness; dense unconstrained optimization; known sparsity pattern; sparse quasi-Newton methods; twice continuously differentiable function
MSC: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Newton-type methods (49M15); Numerical methods based on nonlinear programming (49M37)
Cites Work
- Unnamed Item
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- The computation of Lagrange-multiplier estimates for constrained minimization
- On Variable-Metric Methods for Sparse Hessians
- On the Superlinear Convergence of an Algorithm for Solving a Sparse Minimization Problem
- On the Estimation of Sparse Hessian Matrices
- A note about sparsity exploiting quasi-Newton updates
- Inexact Newton Methods
- Newton’s Method with a Model Trust Region Modification
- Quasi-Newton Methods, Motivation and Theory
- On Sparse and Symmetric Matrix Updating Subject to a Linear Equation
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Methods for Modifying Matrix Factorizations
- A Rapidly Convergent Descent Method for Minimization
- Modification of a Quasi-Newton Method for Nonlinear Equations with a Sparse Jacobian
- A method for the solution of certain non-linear problems in least squares