On the quadratic convergence of the cubic regularization method under a local error bound condition
DOI: 10.1137/18M1167498
zbMATH Open: 1411.90284
arXiv: 1801.09387
OpenAlex: W2963498484
Wikidata: Q128143319 (Scholia: Q128143319)
MaRDI QID: Q4634142 (FDO: Q4634142)
Authors: Man-Chung Yue, Zirui Zhou, Anthony Man-Cho So
Publication date: 7 May 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1801.09387
Recommendations
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- Cubic regularization of Newton method and its global performance
- Accelerating the cubic regularization of Newton's method on convex problems
- On global minimizers of quadratic functions with cubic regularization
- The use of quadratic regularization with a cubic descent condition for unconstrained optimization
Keywords: error bound, low-rank matrix recovery, local quadratic convergence, phase retrieval, cubic regularization, nonisolated solutions, second-order critical points
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Newton-type methods (49M15)
Cites Work
- Phase retrieval via Wirtinger flow: theory and algorithms
- Variational Analysis
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- A method for the solution of certain non-linear problems in least squares
- Local behavior of an iterative framework for generalized equations with nonisolated solutions
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- Ordinary differential equations.
- Nonlinear stepsize control, trust regions and regularizations for unconstrained optimization
- Strong local convergence properties of adaptive regularized methods for nonlinear least squares
- The restricted isometry property and its implications for compressed sensing
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Recent advances in trust region algorithms
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Characterization of metric regularity of subdifferentials
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Cubic regularization of Newton method and its global performance
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Regularized Newton methods for convex minimization problems with singular solutions
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- A geometric analysis of phase retrieval
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Gradient descent finds the cubic-regularized nonconvex Newton step
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
Cited In (9)
- A note on inexact gradient and Hessian conditions for cubic regularized Newton's method
- GNMR: a provable one-line algorithm for low rank matrix recovery
- On high-order multilevel optimization strategies
- Cubic regularized Newton method for the saddle point models: a global and local convergence analysis
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- A unified approach to synchronization problems over subgroups of the orthogonal group
- Nonconvex Robust Low-Rank Matrix Recovery