An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
MaRDI QID: Q2044484
DOI: 10.1007/s10589-021-00274-7
zbMath: 1473.90116
arXiv: 1911.12545
OpenAlex: W3144519047
Man-Chung Yue, Zhishuo Zhou, Rujun Jiang
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1911.12545
Keywords: complexity analysis, constrained convex optimization, first-order methods, cubic regularization subproblem
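For context, the cubic regularization (CR) subproblem named in the keywords is conventionally written in the Nesterov-Polyak form below; this is a standard sketch of the formulation, and the authors' exact notation and scaling may differ (see the arXiv version linked above):

$$
\min_{x \in \mathbb{R}^n} \; g^\top x + \frac{1}{2}\, x^\top H x + \frac{\sigma}{3}\, \|x\|^3,
$$

where $g$ and $H$ are the gradient and (possibly indefinite) Hessian of the objective at the current iterate, and $\sigma > 0$ is the regularization parameter.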
Related Items (3)
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Duality and sensitivity in nonconvex quadratic optimization over an ellipsoid
- A linear-time algorithm for the trust region subproblem based on hidden convexity
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Adaptive restart for accelerated gradient schemes
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Cubic regularization of Newton method and its global performance
- On the use of iterative methods in cubic regularization for unconstrained optimization
- Two-Point Step Size Gradient Methods
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Accelerated Methods for Nonconvex Optimization
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- Finding approximate local minima faster than gradient descent
- Novel Reformulations and Efficient Algorithms for the Generalized Trust Region Subproblem
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- A Second-Order Cone Based Approach for Solving the Trust-Region Subproblem and Its Variants
- Benchmarking optimization software with performance profiles