An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
DOI: 10.1007/s10589-021-00274-7 · zbMATH Open: 1473.90116 · arXiv: 1911.12545 · OpenAlex: W3144519047 · MaRDI QID: Q2044484 · FDO: Q2044484
Authors: Rujun Jiang, Man-Chung Yue, Zhishuo Zhou
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1911.12545
Recommendations
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
- Solving the cubic regularization model by a nested restarting Lanczos method
- Accelerating the cubic regularization of Newton's method on convex problems
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
Keywords: first-order methods; complexity analysis; constrained convex optimization; cubic regularization subproblem
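For orientation, the cubic regularization subproblem named in the title and keywords asks to minimize the model m(x) = gᵀx + ½ xᵀHx + (σ/3)‖x‖³, where H may be indefinite. Below is a minimal illustrative sketch, not the authors' accelerated method: it applies plain gradient descent to this model (in the spirit of the cited work "Gradient descent finds the cubic-regularized nonconvex Newton step"); the instance data `g`, `H`, `sigma` and the step/iteration counts are arbitrary choices for the example.

```python
import numpy as np

def cubic_model(x, g, H, sigma):
    # m(x) = g.x + 0.5 x'Hx + (sigma/3)||x||^3
    return g @ x + 0.5 * x @ H @ x + (sigma / 3.0) * np.linalg.norm(x) ** 3

def solve_crs_gd(g, H, sigma, step=0.01, iters=5000):
    # Plain gradient descent on the cubic model, started from the origin.
    # grad m(x) = g + Hx + sigma*||x||*x
    x = np.zeros_like(g)
    for _ in range(iters):
        grad = g + H @ x + sigma * np.linalg.norm(x) * x
        x = x - step * grad
    return x

# Toy instance (hypothetical data): H is indefinite, so the quadratic part
# alone is unbounded below, but the cubic term makes the model coercive.
g = np.array([1.0, -2.0])
H = np.array([[2.0, 0.0], [0.0, -1.0]])
sigma = 1.0
x_star = solve_crs_gd(g, H, sigma)
```

The first-order optimality condition for the model is (H + σ‖x‖I)x = −g, so the quality of the returned point can be checked by the norm of g + Hx + σ‖x‖x.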
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Benchmarking optimization software with performance profiles
- Adaptive restart for accelerated gradient schemes
- Title not available
- Title not available
- Two-Point Step Size Gradient Methods
- A unified formulation and fast accelerated proximal gradient method for classification
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Cubic regularization of Newton method and its global performance
- Duality and sensitivity in nonconvex quadratic optimization over an ellipsoid
- Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
- Accelerated methods for nonconvex optimization
- Finding approximate local minima faster than gradient descent
- On the use of iterative methods in cubic regularization for unconstrained optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Novel reformulations and efficient algorithms for the generalized trust region subproblem
- A linear-time algorithm for the trust region subproblem based on hidden convexity
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Gradient descent finds the cubic-regularized nonconvex Newton step
- A second-order cone based approach for solving the trust-region subproblem and its variants
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
Cited In (5)
- Scalable adaptive cubic regularization methods
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Solving the cubic regularization model by a nested restarting Lanczos method
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization