An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
From MaRDI portal
Publication:2044484
Abstract: We propose a first-order method to solve the cubic regularization subproblem (CRS) based on a novel reformulation. The reformulation is a constrained convex optimization problem whose feasible region admits an easily computable projection. Our reformulation requires computing the minimum eigenvalue of the Hessian. To avoid the expensive computation of the exact minimum eigenvalue, we develop a surrogate problem to the reformulation where the exact minimum eigenvalue is replaced with an approximate one. We then apply first-order methods such as Nesterov's accelerated projected gradient method (APG) and the projected Barzilai-Borwein method to solve the surrogate problem. As our main theoretical contribution, we show that when an ε-approximate minimum eigenvalue is computed by the Lanczos method and the surrogate problem is approximately solved by APG, our approach returns an ε-approximate solution to CRS in Õ(ε^{-1/2}) matrix-vector multiplications (where Õ hides the logarithmic factors). Numerical experiments show that our methods are comparable to and outperform the Krylov subspace method in the easy and hard cases, respectively. We further implement our methods as subproblem solvers of adaptive cubic regularization methods, and numerical results show that our algorithms are comparable to state-of-the-art algorithms.
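The two computational ingredients the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): (1) approximating the minimum Hessian eigenvalue with a Lanczos-type method that touches the Hessian only through matrix-vector products, and (2) a generic Nesterov accelerated projected gradient loop for a constrained convex problem whose feasible set has an easy projection. The function names, step size, and iteration counts below are illustrative assumptions.

```python
# Hypothetical sketch of the abstract's two building blocks; not the paper's code.
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh


def approx_min_eigenvalue(hess_vec, n, tol=1e-6):
    """Estimate the minimum eigenvalue of the Hessian H using a Lanczos-type
    iteration (ARPACK via eigsh), accessing H only through matvecs."""
    H = LinearOperator((n, n), matvec=hess_vec)
    # 'SA' = smallest algebraic eigenvalue.
    vals = eigsh(H, k=1, which='SA', tol=tol, return_eigenvectors=False)
    return vals[0]


def apg(grad, project, x0, step, n_iters=500):
    """Generic Nesterov accelerated projected gradient method:
    x_{k+1} = P_C(y_k - step * grad(y_k)), with standard momentum on y."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iters):
        x_new = project(y - step * grad(y))
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For instance, `apg` with projection onto the unit ball and the gradient of a smooth convex objective recovers the constrained minimizer; the paper's reformulation plays the analogous role for CRS, with the approximate minimum eigenvalue entering the surrogate problem's data.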
Recommendations
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
- Solving the cubic regularization model by a nested restarting Lanczos method
- Accelerating the cubic regularization of Newton's method on convex problems
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
Cites work
- scientific article; zbMATH DE number 3850830
- scientific article; zbMATH DE number 6159604
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- A linear-time algorithm for the trust region subproblem based on hidden convexity
- A second-order cone based approach for solving the trust-region subproblem and its variants
- A unified formulation and fast accelerated proximal gradient method for classification
- Accelerated methods for nonconvex optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive restart for accelerated gradient schemes
- Benchmarking optimization software with performance profiles
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
- Cubic regularization of Newton method and its global performance
- Duality and sensitivity in nonconvex quadratic optimization over an ellipsoid
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Finding approximate local minima faster than gradient descent
- Gradient descent finds the cubic-regularized nonconvex Newton step
- Novel reformulations and efficient algorithms for the generalized trust region subproblem
- On the quadratic convergence of the cubic regularization method under a local error bound condition
- On the use of iterative methods in cubic regularization for unconstrained optimization
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Two-Point Step Size Gradient Methods
Cited in (5)
- Scalable adaptive cubic regularization methods
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Solving the cubic regularization model by a nested restarting Lanczos method
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
This page was built for publication: An accelerated first-order method with complexity analysis for solving cubic regularization subproblems