Diagonal approximation of the Hessian by finite differences for unconstrained optimization
DOI: 10.1007/s10957-020-01676-z
zbMath: 1441.49029
OpenAlex: W3022983955
MaRDI QID: Q2188948
Publication date: 15 June 2020
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-020-01676-z
Keywords: unconstrained optimization; numerical comparisons; central differences; forward differences; diagonal quasi-Newton update
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Newton-type methods (49M15)
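The title and keywords describe approximating the diagonal of the Hessian by finite differences for use in a diagonal quasi-Newton update. As a rough illustration of the finite-difference part only, the sketch below estimates the Hessian diagonal of a generic objective f: R^n -> R with forward and central differences; the function names, step size, and test objective are illustrative assumptions, not the paper's actual update formula.

```python
import numpy as np

def hessian_diag_forward(f, x, h=1e-4):
    """Forward-difference estimate of diag(Hessian of f) at x.

    Uses (f(x + 2h e_i) - 2 f(x + h e_i) + f(x)) / h^2 along each
    coordinate direction e_i.  Illustrative sketch only.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    diag = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        diag[i] = (f(x + 2.0 * e) - 2.0 * f(x + e) + fx) / h**2
    return diag

def hessian_diag_central(f, x, h=1e-4):
    """Central-difference estimate of diag(Hessian of f) at x.

    Uses (f(x + h e_i) - 2 f(x) + f(x - h e_i)) / h^2; one more
    function evaluation per coordinate, but second-order accurate.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    diag = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        diag[i] = (f(x + e) - 2.0 * fx + f(x - e)) / h**2
    return diag

if __name__ == "__main__":
    # Sanity check on a separable quadratic f(x) = sum(c_i * x_i^2),
    # whose exact Hessian diagonal is 2 * c.
    c = np.array([1.0, 2.0, 3.0])
    f = lambda x: np.dot(c, x**2)
    x0 = np.array([0.5, -1.0, 2.0])
    print(hessian_diag_forward(f, x0))   # approx. [2., 4., 6.]
    print(hessian_diag_central(f, x0))   # approx. [2., 4., 6.]
```

In a diagonal quasi-Newton scheme, such an estimate would typically initialize or safeguard the diagonal matrix that stands in for the full Hessian; see the cited works on the quasi-Cauchy relation and diagonal updating listed below.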
Related Items
- Accelerated nonlinear finite element method for analysis of isotropic hyperelastic materials nonlinear deformations
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- Unnamed Item
- A class of accelerated subspace minimization conjugate gradient methods
Uses Software
Cites Work
- Unnamed Item (9 entries)
- Some numerical experiments with variable-storage quasi-Newton algorithms
- On the limited memory BFGS method for large scale optimization
- Nonlinear optimization with engineering applications
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- The Newton-Cauchy framework. A unified approach to unconstrained nonlinear minimization
- Linear and nonlinear programming
- Continuous nonlinear optimization for engineering applications in GAMS technology
- A diagonal quasi-Newton updating method for unconstrained optimization
- An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
- Sizing and Least-Change Secant Methods
- The Effect of Rounding Errors on Newton-like Methods
- Inaccuracy in quasi-Newton methods: Local improvement theorems
- Preconditioning of Truncated-Newton Methods
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- Quasi-Newton Methods, Motivation and Theory
- CUTE
- The Quasi-Cauchy Relation and Diagonal Updating
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Benchmarking optimization software with performance profiles