Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
From MaRDI portal
DOI: 10.1080/10556788.2018.1510927
zbMath: 1409.90091
arXiv: 1612.06965
OpenAlex: W2563719283
MaRDI QID: Q4646680
Publication date: 14 January 2019
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1612.06965
Related Items (11)
- Rates of superlinear convergence for classical quasi-Newton methods
- Sketched Newton--Raphson
- Non-asymptotic superlinear convergence of standard quasi-Newton methods
- Optimal step length for the maximal decrease of a self-concordant function by the Newton method
- Extended artificial neural networks approach for solving two-dimensional fractional-order Volterra-type integro-differential equations
- Greedy PSB methods with explicit superlinear convergence
- Composite convex optimization with global and local inexact oracles
- New results on superlinear convergence of classical quasi-Newton methods
- Optimal step length for the Newton method: case of self-concordant functions
- Generalized self-concordant functions: a recipe for Newton-type methods
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
Uses Software
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- On the limited memory BFGS method for large scale optimization
- Introductory lectures on convex optimization. A basic course.
- Local convergence analysis for partitioned quasi-Newton updates
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Updating Quasi-Newton Matrices with Limited Storage
- Block BFGS Methods
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Rapidly Convergent Descent Method for Minimization
- Composite Self-Concordant Minimization
- Quasi-Newton Methods and their Application to Function Minimisation
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization