On inexact solution of auxiliary problems in tensor methods for convex optimization
DOI: 10.1080/10556788.2020.1731749 · zbMath: 1464.90057 · arXiv: 1907.13023 · OpenAlex: W3020523222 · MaRDI QID: Q5859013
Authors: Geovani Nunes Grapiglia, Yu. E. Nesterov
Publication date: 15 April 2021
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1907.13023
Keywords: unconstrained minimization; high-order methods; Hölder condition; tensor methods; worst-case global complexity bounds
MSC classifications: Convex programming (90C25); Nonlinear programming (90C30); Newton-type methods (49M15); Numerical methods based on nonlinear programming (49M37); Implicit function theorems; global Newton methods on manifolds (58C15)
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- Inexact basic tensor methods for some classes of convex optimization problems
- Superfast second-order methods for unconstrained convex optimization
- Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
- Super-Universal Regularized Newton Method
- Adaptive Third-Order Methods for Composite Convex Optimization
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Accelerating the cubic regularization of Newton's method on convex problems
- Cubic regularization of Newton method and its global performance
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- On High-order Model Regularization for Constrained Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications