A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
DOI: 10.1016/j.orl.2019.01.009
zbMATH Open: 1476.90352
OpenAlex: W2912642813
Wikidata: Q128411331 (Scholia: Q128411331)
MaRDI QID: Q2294286
FDO: Q2294286
Authors: Yanyan Li
Publication date: 10 February 2020
Published in: Operations Research Letters
Full work available at URL: https://doi.org/10.1016/j.orl.2019.01.009
Recommendations
- On the quadratic convergence of the cubic regularization method under a local error bound condition
- Cubic regularization of Newton method and its global performance
- A cubic regularization of Newton's method with finite difference Hessian approximations
- Accelerating the cubic regularization of Newton's method on convex problems
- Minimizing uniformly convex functions by cubic regularization of Newton method
MSC classification:
- Numerical optimization and variational techniques (65K10)
- Nonconvex programming, global optimization (90C26)
- Methods of quasi-Newton type (90C53)
- Newton-type methods (49M15)
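The recommended works and classifications above all concern the cubic regularization of Newton's method, whose inner iteration minimizes the model \(m(s) = f(x) + \nabla f(x)^\top s + \tfrac{1}{2} s^\top \nabla^2 f(x)\, s + \tfrac{M}{6}\|s\|^3\). As a minimal illustrative sketch (the fixed-point subproblem solver and the test quadratic are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def cubic_newton_step(grad, hess, M, iters=50):
    """Minimize the cubic model  g^T s + 0.5 s^T H s + (M/6)||s||^3.

    Uses the optimality condition (H + (M*r/2) I) s = -g with r = ||s||,
    solved by fixed-point iteration on r.  Assumes H + (M*r/2) I stays
    positive definite (e.g. H positive definite), so each solve is valid.
    """
    n = grad.shape[0]
    r = 0.0
    for _ in range(iters):
        s = np.linalg.solve(hess + 0.5 * M * r * np.eye(n), -grad)
        r = np.linalg.norm(s)
    return s

# One step on the convex quadratic f(x) = 0.5*||x||^2 - (x1 + x2),
# starting from x = 0, where grad = [-1, -1] and hess = I:
s = cubic_newton_step(np.array([-1.0, -1.0]), np.eye(2), M=1.0)
```

The cubic penalty shrinks the step relative to the plain Newton step `[1, 1]`, which is the mechanism that gives the method its global convergence guarantees.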
Cites Work
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic regularization of Newton method and its global performance
- An adaptive cubic regularization algorithm for nonconvex optimization with convex constraints and its function-evaluation complexity
- Complexity bounds for second-order optimality in unconstrained optimization
Cited In (3)