Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
Publication: 4646444
DOI: 10.1137/17M1142077
zbMATH: 1406.49030
OpenAlex: W2792215433
Wikidata: Q128620675 (Scholia: Q128620675)
MaRDI QID: Q4646444
Yu. E. Nesterov, Geovani Nunes Grapiglia
Publication date: 14 January 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/17m1142077
Mathematics Subject Classification (MSC):
- Convex programming (90C25)
- Nonlinear programming (90C30)
- Newton-type methods (49M15)
- Numerical methods based on nonlinear programming (49M37)
- Implicit function theorems; global Newton methods on manifolds (58C15)
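For orientation, a sketch drawn from the cited literature below (e.g. "Gradient methods for minimizing composite functions" and "Cubic regularization of Newton method and its global performance"), not from this record itself: the composite setting of the title concerns problems of the form
\[
\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + \psi(x),
\]
with \(f\) convex and twice differentiable and \(\psi\) a simple closed convex function. In that line of work, one regularized Newton step solves
\[
x_{k+1} \in \operatorname*{arg\,min}_{y} \Big[ \langle \nabla f(x_k), y - x_k \rangle + \tfrac{1}{2} \langle \nabla^2 f(x_k)(y - x_k), y - x_k \rangle + \tfrac{M}{6} \|y - x_k\|^3 + \psi(y) \Big],
\]
shown here with the cubic power for concreteness; for Hessians that are only Hölder continuous with exponent \(\nu\), the regularization power \(3\) is typically replaced by \(2 + \nu\).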
Related Items
- On the complexity of solving feasibility problems with regularized models
- Tensor methods for finding approximate stationary points of convex functions
- Local convergence of tensor methods
- Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Smoothness parameter of power of Euclidean norm
- Super-Universal Regularized Newton Method
- Adaptive Third-Order Methods for Composite Convex Optimization
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Contracting Proximal Methods for Smooth Convex Optimization
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- A control-theoretic perspective on optimal high-order optimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
Cites Work
- Smooth minimization of non-smooth functions
- Gradient methods for minimizing composite functions
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon ^{-3/2})\) for nonconvex optimization
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Accelerating the cubic regularization of Newton's method on convex problems
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- On the use of the energy norm in trust-region and adaptive cubic regularization subproblems
- Cubic regularization of Newton method and its global performance
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- ARCq: a new adaptive regularization by cubics
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians