Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
Publication: 2676160
DOI: 10.1007/s40305-022-00398-5
OpenAlex: W4224005576
MaRDI QID: Q2676160
Zirui Zhou, Rujun Jiang, Zhishuo Zhou
Publication date: 27 September 2022
Published in: Journal of the Operations Research Society of China
Full work available at URL: https://arxiv.org/abs/2112.09291
Keywords: complexity analysis; constrained convex optimization; first-order methods; cubic regularization subproblem
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Lectures on convex optimization
- Duality and sensitivity in nonconvex quadratic optimization over an ellipsoid
- A linear-time algorithm for the trust region subproblem based on hidden convexity
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Newton-type methods for non-convex optimization under inexact Hessian information
- Implementable tensor methods in unconstrained convex optimization
- A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Adaptive restart for accelerated gradient schemes
- Cubic regularization of Newton method and its global performance
- Two-Point Step Size Gradient Methods
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Accelerated Methods for Nonconvex Optimization
- Complexity Analysis of Second-Order Line-Search Algorithms for Smooth Nonconvex Optimization
- Finding approximate local minima faster than gradient descent
- First-Order Methods for Nonconvex Quadratic Minimization
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Novel Reformulations and Efficient Algorithms for the Generalized Trust Region Subproblem
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- A Second-Order Cone Based Approach for Solving the Trust-Region Subproblem and Its Variants
- Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- Benchmarking optimization software with performance profiles