Minimizing uniformly convex functions by cubic regularization of Newton method
From MaRDI portal
Publication: 2032037
DOI: 10.1007/s10957-021-01838-7
zbMath: 1470.90075
arXiv: 1905.02671
OpenAlex: W2944447206
MaRDI QID: Q2032037
Nikita Doikov, Yu. E. Nesterov
Publication date: 15 June 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1905.02671
MSC classification
- Convex programming (90C25)
- Nonlinear programming (90C30)
- Newton-type methods (49M15)
- Numerical methods based on nonlinear programming (49M37)
- Convexity of real functions of several variables, generalizations (26B25)
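The method named in the title computes each step by minimizing a cubic model of the objective: the increment h minimizes <g, h> + ½ hᵀH h + (M/6)‖h‖³, whose minimizer satisfies (H + (M/2)‖h‖ I) h = −g. As a rough illustration only, not the authors' implementation, here is a minimal NumPy sketch of one such step, finding r = ‖h‖ by bisection (the function name and tolerances are my own choices):

```python
import numpy as np

def cubic_newton_step(g, H, M):
    """One cubically regularized Newton step (illustrative sketch).

    Minimizes <g, h> + 0.5 h^T H h + (M/6) * ||h||^3 over h.
    The minimizer satisfies (H + (M/2)*||h|| * I) h = -g, so we
    search for r = ||h|| by bisection on phi(r) = ||h(r)|| - r,
    which is strictly decreasing for positive semidefinite H.
    """
    n = len(g)
    I = np.eye(n)

    def h_of(r):
        # Candidate step for a given trial radius r > 0.
        return np.linalg.solve(H + 0.5 * M * r * I, -g)

    # Bracket the root: double hi until ||h(hi)|| <= hi.
    lo, hi = 0.0, 1.0
    while np.linalg.norm(h_of(hi)) > hi:
        hi *= 2.0

    # Bisection; 100 halvings shrink the bracket far below round-off.
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(h_of(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return h_of(hi)
```

On exit the step satisfies the stationarity condition of the cubic model up to bisection accuracy; this sketch omits the adaptive choice of the regularization parameter M, which is central to the methods studied in the paper and its cited works.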
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- Local convergence of tensor methods
- Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
- Smoothness parameter of power of Euclidean norm
- SCORE: approximating curvature information under self-concordant regularization
- Super-Universal Regularized Newton Method
- Unnamed Item
- High-Order Optimization Methods for Fully Composite Problems
Cites Work
- Unnamed Item
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Cubic regularization of Newton method and its global performance
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Finding approximate local minima faster than gradient descent
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- Modified Gauss–Newton scheme with worst case guarantees for global performance
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians