Hessian averaging in stochastic Newton methods achieves superlinear convergence
Publication: 6165593
DOI: 10.1007/s10107-022-01913-5
arXiv: 2204.09266
MaRDI QID: Q6165593
Michał Dereziński, Sen Na, Michael W. Mahoney
Publication date: 1 August 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2204.09266
MSC classification: Convex programming (90C25) · Large-scale problems in mathematical programming (90C06) · Stochastic programming (90C15) · Methods of quasi-Newton type (90C53)
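For orientation, the iteration named in the title can be sketched as follows: instead of using a fresh noisy Hessian estimate at every step, the method takes Newton steps with a running average of all past Hessian estimates, which damps the stochastic noise over time. The Python sketch below is only an illustrative reading of that idea under assumed interfaces (`grad`, `hess_sample`, and the toy quadratic are hypothetical names, not taken from the paper); it is not the authors' implementation.

```python
import numpy as np

def averaged_newton(grad, hess_sample, x0, steps=50):
    """Minimal sketch: stochastic Newton with uniform Hessian averaging.

    grad(x)        -- exact gradient of the objective at x
    hess_sample(x) -- a noisy (e.g., subsampled) Hessian estimate at x
    """
    x = np.asarray(x0, dtype=float).copy()
    h_bar = hess_sample(x)  # running average of Hessian estimates
    for t in range(1, steps + 1):
        # Newton step using the averaged Hessian rather than a fresh noisy one
        x = x - np.linalg.solve(h_bar, grad(x))
        # Uniform averaging: h_bar becomes the mean of t+1 estimates so far
        h_bar = (t * h_bar + hess_sample(x)) / (t + 1)
    return x

# Toy usage (assumed example): minimize 0.5 * x^T A x with a noisy Hessian oracle.
rng = np.random.default_rng(0)
A = np.diag([1.0, 4.0, 9.0])                # ground-truth Hessian
grad = lambda x: A @ x                      # gradient of the quadratic
def hess_sample(x):
    noise = rng.normal(scale=0.1, size=A.shape)
    return A + (noise + noise.T) / 2        # symmetrized noisy estimate

x_star = averaged_newton(grad, hess_sample, np.ones(3))
```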
Cites Work
- Freedman's inequality for matrix martingales
- User-friendly tail bounds for sums of random matrices
- Sample size selection in optimization methods for machine learning
- Stochastic optimization using a trust-region method and random models
- Sub-sampled Newton methods
- New results on superlinear convergence of classical quasi-Newton methods
- Rates of superlinear convergence for classical quasi-Newton methods
- First-order and stochastic optimization methods for machine learning
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- On the Use of Stochastic Hessian Information in Optimization Methods for Machine Learning
- Low-Rank Approximation and Regression in Input Sparsity Time
- Randomized Iterative Methods for Linear Systems
- Optimization Methods for Large-Scale Machine Learning
- High-Dimensional Probability
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- An investigation of Newton-Sketch and subsampled Newton methods
- Convergence of Newton-MR under Inexact Hessian Information
- Globally Convergent Levenberg-Marquardt Method for Phase Retrieval
- Understanding Machine Learning
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
- Exact and inexact subsampled Newton methods for optimization
- Subsampled inexact Newton methods for minimizing large sums of convex functions
- An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians
- Non-asymptotic superlinear convergence of standard quasi-Newton methods
- Inexact Newton-CG algorithms with complexity guarantees