Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
DOI: 10.1080/02331934.2021.1892104 · zbMath: 1483.65091 · arXiv: 2001.10827 · OpenAlex: W3133851861 · MaRDI QID: Q5034938
Gianmarco Gurioli, Stefania Bellavia
Publication date: 21 February 2022
Published in: Optimization
Full work available at URL: https://arxiv.org/abs/2001.10827
Keywords: worst-case complexity analysis; finite-sum minimization; adaptive cubic regularization methods; inexact derivative evaluations; stochastic nonconvex optimization
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Complexity bounds for second-order optimality in unconstrained optimization
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Stochastic optimization using a trust-region method and random models
- Newton-type methods for non-convex optimization under inexact Hessian information
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- Cubic regularization of Newton method and its global performance
- On the use of iterative methods in cubic regularization for unconstrained optimization
- On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Optimization Methods for Large-Scale Machine Learning
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- An investigation of Newton-Sketch and subsampled Newton methods
- A Stochastic Line Search Method with Expected Complexity Analysis
- Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- An Introduction to Matrix Concentration Inequalities