Analysis of the BFGS Method with Errors
Publication: 5210518
DOI: 10.1137/19M1240794 · zbMath: 1435.90149 · arXiv: 1901.09063 · OpenAlex: W2999729425 · Wikidata: Q126384901 · Scholia: Q126384901 · MaRDI QID: Q5210518
Byrd, Richard H.; Nocedal, Jorge; Xie, Yuchen
Publication date: 21 January 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1901.09063
MSC classification: Nonlinear programming (90C30); Derivative-free methods and methods using generalized derivatives (90C56); Methods of quasi-Newton type (90C53)
Related Items
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- A trust region method for noisy unconstrained optimization
- An overview of stochastic quasi-Newton methods for large-scale machine learning
- Adaptive sampling quasi-Newton methods for zeroth-order stochastic optimization
- Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition
- Structure-preserving deep learning
- A Noise-Tolerant Quasi-Newton Algorithm for Unconstrained Optimization
Uses Software
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- Estimating Computational Noise
- Implicit Filtering
- The Effect of Rounding Errors on Newton-like Methods
- Inaccuracy in quasi-Newton methods: Local improvement theorems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Numerical Optimization
- Superlinear Convergence and Implicit Filtering
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- A Stochastic Line Search Method with Expected Complexity Analysis