Greedy PSB methods with explicit superlinear convergence
DOI: 10.1007/s10589-023-00495-y
OpenAlex: W4379618111
MaRDI QID: Q6175468
Publication date: 24 July 2023
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-023-00495-y
Cites Work
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- The BFGS method with exact line searches fails for non-convex objective functions
- A perfect example for the BFGS method
- New results on superlinear convergence of classical quasi-Newton methods
- Rates of superlinear convergence for classical quasi-Newton methods
- Optimization theory and methods. Nonlinear programming
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- On the Behavior of Broyden’s Class of Quasi-Newton Methods
- Variable Metric Method for Minimization
- Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
- On the Global Convergence of Broyden's Method
- Numerical Optimization
- Local and Superlinear Convergence of Structured Quasi-Newton Methods for Nonlinear Optimization
- Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
- Analysis of a Symmetric Rank-One Trust Region Method
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Convergence Properties of the BFGS Algorithm
- On the convergence of a wide range of trust region methods for unconstrained optimization
- A Rapidly Convergent Descent Method for Minimization
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- On the Convergence of the Variable Metric Algorithm
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- A New Algorithm for Unconstrained Optimization
- Quasi-Newton algorithms generate identical points
- IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence