Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step


DOI: 10.1137/17M1113898
zbMath: 1461.65135
arXiv: 1612.00547
OpenAlex: W2971617498
Wikidata: Q127320765
Scholia: Q127320765
MaRDI QID: Q5233102

Yair Carmon, John C. Duchi

Publication date: 16 September 2019

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1612.00547
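The paper shows that plain gradient descent globally minimizes the (generally nonconvex) cubic-regularized Newton model min_x g^T x + (1/2) x^T A x + (rho/3) ||x||^3. Below is a minimal illustrative sketch in Python, assuming a small fixed step size and zero initialization; the paper's precise step-size conditions and initialization scheme are given in the linked preprint.

    import numpy as np

    def model_grad(g, A, rho, x):
        # Gradient of the cubic model: g + A x + rho * ||x|| * x
        return g + A @ x + rho * np.linalg.norm(x) * x

    def gd_cubic_step(g, A, rho, eta=1e-2, iters=20000):
        # Fixed-step gradient descent on the cubic model.
        # eta and iters are illustrative choices, not the paper's constants,
        # and x = 0 is an assumed starting point for this sketch.
        x = np.zeros_like(g)
        for _ in range(iters):
            x = x - eta * model_grad(g, A, rho, x)
        return x

    # Example with an indefinite symmetric A, so the model itself is nonconvex.
    rng = np.random.default_rng(0)
    B = rng.standard_normal((5, 5))
    A = (B + B.T) / 2
    g = rng.standard_normal(5)
    x_hat = gd_cubic_step(g, A, rho=1.0)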




Related Items

An improvement of adaptive cubic regularization method for unconstrained optimization problems
Smoothness parameter of power of Euclidean norm
Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
A Newton-CG Based Augmented Lagrangian Method for Finding a Second-Order Stationary Point of Nonconvex Equality Constrained Optimization with Complexity Guarantees
Random Coordinate Descent Methods for Nonseparable Composite Optimization
First-Order Methods for Nonconvex Quadratic Minimization
Newton-type methods for non-convex optimization under inexact Hessian information
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
Second-Order Guarantees of Distributed Gradient Algorithms
Recent Theoretical Advances in Non-Convex Optimization
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
Minimizing uniformly convex functions by cubic regularization of Newton method
Adaptive regularization with cubics on manifolds
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
Error estimates for iterative algorithms for minimizing regularized quadratic subproblems
An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
Adaptive regularization for nonconvex optimization using inexact function values and randomly perturbed derivatives
Combining stochastic adaptive cubic regularization with negative curvature for nonconvex optimization
Unified Acceleration of High-Order Algorithms under General Hölder Continuity



