Limited-memory BFGS with displacement aggregation
DOI: 10.1007/s10107-021-01621-6 · zbMath: 1492.65161 · arXiv: 1903.03471 · OpenAlex: W3128577503 · MaRDI QID: Q2149548
Frank E. Curtis, Baoyu Zhou, Albert S. Berahas
Publication date: 29 June 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1903.03471
nonlinear optimization; superlinear convergence; limited-memory BFGS; Broyden-Fletcher-Goldfarb-Shanno method; quasi-Newton algorithms
Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
Related Items (1)
Uses Software
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- Nonsmooth optimization via quasi-Newton methods
- A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
- Local and superlinear convergence of a class of variable metric methods
- Representations of quasi-Newton matrices and their use in limited memory methods
- A family of variable metric proximal methods
- A numerical study of limited memory BFGS methods
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Global Convergence of Online Limited Memory BFGS
- An adaptive gradient sampling algorithm for non-smooth optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Global and superlinear convergence of a class of variable metric methods
- Updating Quasi-Newton Matrices with Limited Storage
- Variable Metric Method for Minimization
- BFGS with Update Skipping and Varying Memory
- Quasi-Newton Bundle-Type Methods for Nondifferentiable Convex Optimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Methods for Modifying Matrix Factorizations
- A robust multi-batch L-BFGS method for machine learning
- A self-correcting variable-metric algorithm framework for nonsmooth optimization
- Adaptive, Limited-Memory BFGS Algorithms for Unconstrained Optimization
- New limited memory bundle method for large-scale nonsmooth optimization
- A Family of Variable-Metric Methods Derived by Variational Means
- Variable metric methods of minimisation
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- Globally convergent variable metric method for nonconvex nondifferentiable unconstrained minimization
- Benchmarking optimization software with performance profiles.
This page was built for publication: Limited-memory BFGS with displacement aggregation