Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
DOI: 10.1137/17M1113898 · zbMath: 1461.65135 · arXiv: 1612.00547 · OpenAlex: W2971617498 · Wikidata: Q127320765 · Scholia: Q127320765 · MaRDI QID: Q5233102
Publication date: 16 September 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1612.00547
Keywords: global optimization; Newton's method; power method; gradient descent; trust region methods; cubic regularization; nonasymptotic rate of convergence; nonconvex quadratics
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Quadratic programming (90C20)
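The subproblem named in the title and keywords is the cubic-regularized second-order model min_s g^T s + (1/2) s^T A s + (rho/3)||s||^3, a possibly nonconvex quadratic plus a cubic penalty. The following minimal Python sketch (not part of this record) only illustrates running plain gradient descent on that model; the matrix A, vector g, regularization weight rho, step size, iteration budget, and the simple initialization along -g are arbitrary illustrative choices, not the initialization or step-size rule analyzed in the paper.

```python
# Illustrative sketch: plain gradient descent on the cubic-regularized
# Newton subproblem  f(s) = g^T s + 0.5 s^T A s + (rho/3) ||s||^3.
# All problem data and algorithm parameters below are assumptions for
# the sake of the example, not values taken from the publication.
import numpy as np

def cubic_model(s, A, g, rho):
    """Value of the cubic-regularized model at s."""
    return g @ s + 0.5 * s @ (A @ s) + (rho / 3.0) * np.linalg.norm(s) ** 3

def cubic_model_grad(s, A, g, rho):
    """Gradient of the model: g + A s + rho * ||s|| * s."""
    return g + A @ s + rho * np.linalg.norm(s) * s

def gradient_descent_cubic(A, g, rho, step=5e-3, iters=20000):
    # Start with a small step along -g (a stand-in for the paper's
    # initialization) and take fixed-step gradient steps.
    s = -step * g
    for _ in range(iters):
        s = s - step * cubic_model_grad(s, A, g, rho)
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    B = rng.standard_normal((n, n))
    A = (B + B.T) / 2          # symmetric, generally indefinite
    g = rng.standard_normal(n)
    rho = 1.0
    s = gradient_descent_cubic(A, g, rho)
    print("model value:  ", cubic_model(s, A, g, rho))
    print("gradient norm:", np.linalg.norm(cubic_model_grad(s, A, g, rho)))
```

With a sufficiently small fixed step, the iterates stay in a bounded sublevel set of the coercive model and the gradient norm shrinks; the paper's contribution is a nonasymptotic convergence rate for this kind of first-order scheme on the nonconvex subproblem.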
Cites Work
- A linear-time algorithm for trust region problems
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Complexity bounds for second-order optimality in unconstrained optimization
- On solving trust-region and other regularised subproblems in optimization
- Introductory lectures on convex optimization. A basic course.
- Cubic regularization of Newton method and its global performance
- On the use of iterative methods in cubic regularization for unconstrained optimization
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- A Subspace Minimization Method for the Trust-Region Step
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- A D.C. Optimization Algorithm for Solving the Trust-Region Subproblem
- Trust Region Methods
- Globally Solving the Trust Region Subproblem Using Simple First-Order Methods
- Accelerated Methods for Nonconvex Optimization
- Optimization Methods for Large-Scale Machine Learning
- Solving the Trust-Region Subproblem using the Lanczos Method
- Finding approximate local minima faster than gradient descent
- Tight query complexity lower bounds for PCA via finite sample deformed Wigner law
- A Second-Order Cone Based Approach for Solving the Trust-Region Subproblem and Its Variants
- On the Generalized Lanczos Trust-Region Method
- Affine conjugate adaptive Newton methods for nonlinear elastomechanics
- Some methods of speeding up the convergence of iteration methods