A regularized limited memory BFGS method for nonconvex unconstrained minimization
From MaRDI portal
Publication: Q2248965
DOI: 10.1007/s11075-013-9706-y
zbMath: 1291.65175
OpenAlex: W1966654169
MaRDI QID: Q2248965
Publication date: 27 June 2014
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-013-9706-y
Numerical mathematical programming methods (65K05) ⋮ Numerical optimization and variational techniques (65K10)
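For context, the method in this publication builds on the standard limited memory BFGS (L-BFGS) update, whose search direction is usually computed with the two-loop recursion. Below is a minimal, illustrative sketch of that recursion (not the paper's regularized variant); the function and variable names are assumptions for the example, and plain Python lists stand in for vectors.

```python
def dot(a, b):
    """Inner product of two vectors given as Python lists."""
    return sum(x * y for x, y in zip(a, b))

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns -H*grad, where H is the
    implicit inverse-Hessian approximation built from the stored pairs
    s_i = x_{i+1} - x_i and y_i = grad_{i+1} - grad_i (oldest first)."""
    rhos = [1.0 / dot(y, s) for s, y in zip(s_list, y_list)]
    q = list(grad)
    alphas = []  # collected in reverse (newest pair first)
    for i in reversed(range(len(s_list))):
        alpha = rhos[i] * dot(s_list[i], q)
        alphas.append(alpha)
        q = [qj - alpha * yj for qj, yj in zip(q, y_list[i])]
    # Scale the initial Hessian approximation by gamma = s^T y / y^T y
    # using the most recent pair (a common default choice).
    if s_list:
        gamma = dot(s_list[-1], y_list[-1]) / dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = [gamma * qj for qj in q]
    for i in range(len(s_list)):
        beta = rhos[i] * dot(y_list[i], r)
        alpha = alphas[len(s_list) - 1 - i]
        r = [rj + (alpha - beta) * sj for rj, sj in zip(r, s_list[i])]
    return [-rj for rj in r]  # descent direction -H*grad
```

With no stored pairs the recursion reduces to steepest descent; on a quadratic with identity Hessian (where s = y for every pair) it reproduces -grad exactly, which makes it easy to sanity-check.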
Related Items (2)
A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems ⋮ A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
Uses Software
Cites Work
- On the convergence property of the DFP algorithm
- A compact limited memory method for large scale unconstrained optimization
- On the limited memory BFGS method for large scale optimization
- A limited memory BFGS-type method for large-scale unconstrained optimization
- The BFGS method with exact line searches fails for non-convex objective functions
- On the convergence of the DFP algorithm for unconstrained optimization when there are only two variables
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- A numerical study of limited memory BFGS methods
- Improved Hessian approximations for the limited memory BFGS method
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Convergence of the BFGS Method for $LC^1 $ Convex Constrained Optimization
- Testing Unconstrained Optimization Software
- Updating Quasi-Newton Matrices with Limited Storage
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- Quasi-Newton Methods, Motivation and Theory
- Numerical Optimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- Quadratic Termination Properties of Minimization Algorithms I. Statement and Discussion of Results
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
- Global convergence of a regularized factorized quasi-Newton method for nonlinear least squares problems