Super-Universal Regularized Newton Method
DOI: 10.1137/22M1519444 · arXiv: 2208.05888 · OpenAlex: W4390543603 · MaRDI QID: Q6136654 · FDO: Q6136654
Authors: Nikita Doikov, Konstantin Mishchenko, Yuri Nesterov
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2208.05888
Recommendations
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Regularized Newton method with global \(\mathcal{O}(1/k^2)\) convergence
- Gradient regularization of Newton method with Bregman distances
- Accelerated regularized Newton methods for minimizing composite convex functions
- Accelerating the cubic regularization of Newton's method on convex problems
Keywords: regularization, convex optimization, global convergence, Newton method, global complexity bounds, universal methods
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
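The keywords above point to regularized Newton methods with global convergence guarantees. A minimal sketch of one common variant from this literature, the gradient-regularized step \(x_{k+1} = x_k - (\nabla^2 f(x_k) + \sqrt{H\,\|\nabla f(x_k)\|}\,I)^{-1}\nabla f(x_k)\), is shown below. The constant `H` (a Hessian-smoothness estimate), the helper name, and the test function are illustrative assumptions, not taken from this page:

```python
import numpy as np

def gradient_regularized_newton(grad, hess, x0, H=1.0, tol=1e-8, max_iter=100):
    """Newton iteration damped by lam_k = sqrt(H * ||grad||) times the identity.

    `H` stands in for a Hessian-Lipschitz estimate; here it is a fixed,
    hypothetical value rather than the adaptive rule an actual method would use.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:          # gradient small enough: stop
            break
        lam = np.sqrt(H * gnorm)  # regularizer shrinks as we approach a solution
        A = hess(x) + lam * np.eye(x.size)
        x = x - np.linalg.solve(A, g)  # regularized Newton step
    return x

# Smooth convex test problem: f(x) = sum_i log(cosh(x_i - 1)), minimized at x = 1
grad = lambda x: np.tanh(x - 1.0)
hess = lambda x: np.diag(1.0 / np.cosh(x - 1.0) ** 2)

x_star = gradient_regularized_newton(grad, hess, x0=np.array([5.0, -3.0]))
```

Far from the minimizer the tiny Hessian of this test function would make a pure Newton step overshoot badly; the \(\sqrt{\|\nabla f\|}\) regularizer caps the step length, while near the solution it vanishes and the fast local behavior of Newton's method is recovered.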
Cites Work
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- A method for the solution of certain non-linear problems in least squares
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Trust Region Methods
- Iterative Solution of Nonlinear Equations in Several Variables
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Newton's method and its use in optimization
- Cubic regularization of Newton method and its global performance
- Accelerating the cubic regularization of Newton's method on convex problems
- Regularized Newton method for unconstrained convex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- A regularized Newton method without line search for unconstrained optimization
- Maximization by Quadratic Hill-Climbing
- Oracle complexity of second-order methods for smooth convex optimization
- Lectures on convex optimization
- Relatively smooth convex optimization by first-order methods, and applications
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives
- Near-optimal hyperfast second-order method for convex optimization
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- Sharp worst-case evaluation complexity bounds for arbitrary-order nonconvex optimization with inexpensive constraints
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Accelerated regularized Newton methods for minimizing composite convex functions
- Implementable tensor methods in unconstrained convex optimization
- Contracting proximal methods for smooth convex optimization
- Local convergence of tensor methods
- Superfast second-order methods for unconstrained convex optimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- Affine-invariant contracting-point methods for convex optimization
- Inexact accelerated high-order proximal-point methods
Cited In (1)