Super-Universal Regularized Newton Method
DOI: 10.1137/22m1519444 · arXiv: 2208.05888 · OpenAlex: W4390543603 · MaRDI QID: Q6136654
Konstantin Mishchenko, Yu. E. Nesterov, Nikita Doikov
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2208.05888
Keywords: global convergence, convex optimization, regularization, Newton method, global complexity bounds, universal methods
MSC: Numerical mathematical programming methods (65K05) · Convex programming (90C25) · Nonlinear programming (90C30) · Methods of quasi-Newton type (90C53)
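Neither this record nor the metadata above spells out the iteration, so the following is only a minimal sketch of a gradient-regularized Newton step of the kind the title and keywords point to. The update rule x⁺ = x − (∇²f(x) + √(H‖∇f(x)‖) I)⁻¹ ∇f(x), the constant H, and all function names are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def regularized_newton(grad, hess, x0, H=1.0, max_iter=100, tol=1e-10):
    """Sketch of a gradient-regularized Newton iteration (assumed form):
        x+ = x - (hess(x) + sqrt(H * ||grad(x)||) * I)^{-1} grad(x)
    H is a hypothetical smoothness constant, not a rule from the paper."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:                      # first-order stationarity test
            break
        lam = np.sqrt(H * gnorm)              # regularization tied to gradient norm
        x = x - np.linalg.solve(hess(x) + lam * np.eye(x.size), g)
    return x

# Usage on a smooth convex test problem f(x) = sum(x_i^4), minimizer at 0.
if __name__ == "__main__":
    grad = lambda x: 4.0 * x**3
    hess = lambda x: np.diag(12.0 * x**2)
    x_final = regularized_newton(grad, hess, x0=np.ones(5))
    print(np.linalg.norm(x_final))            # small: iterates approach 0
```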
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Lectures on convex optimization
- A regularized Newton method without line search for unconstrained optimization
- Newton's method and its use in optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Regularized Newton method for unconstrained convex optimization
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Local convergence of tensor methods
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Cubic regularization of Newton method and its global performance
- Superfast second-order methods for unconstrained convex optimization
- Affine-invariant contracting-point methods for convex optimization
- Inexact accelerated high-order proximal-point methods
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Trust Region Methods
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Iterative Solution of Nonlinear Equations in Several Variables
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Contracting Proximal Methods for Smooth Convex Optimization
- Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
- Maximization by Quadratic Hill-Climbing
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- A method for the solution of certain non-linear problems in least squares
- On inexact solution of auxiliary problems in tensor methods for convex optimization