Oracle complexity of second-order methods for smooth convex optimization
Publication: Q2330652
Abstract: Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we prove tight bounds on the oracle complexity of such methods for smooth convex functions, or equivalently, on the worst-case number of iterations required to optimize such functions to a given accuracy. In particular, these bounds indicate when such methods can or cannot improve on gradient-based methods, whose oracle complexity is much better understood. We also provide generalizations of our results to higher-order methods.
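A concrete instance of the second-order methods analyzed here is the cubic regularization of Newton's method (Nesterov and Polyak, listed among the cited works below): each iteration minimizes the second-order Taylor model plus a cubic penalty on the step. The NumPy sketch that follows is illustrative only, assuming a fixed regularization parameter `M` and solving the cubic subproblem by bisection on the step norm; the function names and the test problem are hypothetical choices, not the paper's construction.

```python
import numpy as np

def cubic_newton_step(g, H, M):
    """Minimize the model  g.s + 0.5 s.H.s + (M/6)||s||^3  for H positive semidefinite.

    First-order optimality gives (H + (M/2)||s|| I) s = -g, so we bisect on
    r = ||s||: the norm of (H + (M/2) r I)^{-1} g is decreasing in r, hence
    there is a unique r with ||s(r)|| = r.
    """
    n = len(g)
    def trial(r):
        # Guard r away from 0 so the system stays nonsingular when H is singular.
        return np.linalg.solve(H + 0.5 * M * max(r, 1e-12) * np.eye(n), -g)
    hi = 1.0
    while np.linalg.norm(trial(hi)) > hi:   # expand bracket until ||s(hi)|| <= hi
        hi *= 2.0
    lo = 0.0
    for _ in range(60):                     # bisection on the step norm
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(trial(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return trial(hi)

def cubic_newton(grad, hess, x0, M, iters=30):
    """Basic (non-accelerated) cubic-regularized Newton loop."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x + cubic_newton_step(grad(x), hess(x), M)
    return x

# Demo on the smooth convex function f(x) = ||x||^4 + b.x
# (M = 50 is a heuristic stand-in for the Hessian's Lipschitz constant).
b = np.array([1.0, -2.0])
grad = lambda x: 4.0 * np.dot(x, x) * x + b
hess = lambda x: 4.0 * np.dot(x, x) * np.eye(2) + 8.0 * np.outer(x, x)
x_star = cubic_newton(grad, hess, np.zeros(2), M=50.0)
print(x_star, np.linalg.norm(grad(x_star)))
```

In the oracle model referenced by the abstract, each iteration issues one query returning local information (gradient and Hessian) about the function; in the sketch above, every loop iteration makes exactly one such query, so iteration counts and oracle calls coincide.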
Recommendations
- An optimal gradient method for smooth strongly convex minimization
- Superfast second-order methods for unconstrained convex optimization
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
- On the oracle complexity of smooth strongly convex minimization
- The exact information-based complexity of smooth convex minimization
Cites work
- scientific article; zbMATH DE number 3850830
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 729680
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 3052543
- Accelerating the cubic regularization of Newton's method on convex problems
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Cubic regularization of Newton method and its global performance
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- Introductory lectures on convex optimization. A basic course.
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- On uniformly convex functionals
- Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
Cited in (33)
- An optimal high-order tensor method for convex optimization
- Super-Universal Regularized Newton Method
- Higher-order methods for convex-concave min-max optimization and monotone variational inequalities
- Contracting proximal methods for smooth convex optimization
- Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
- On the oracle complexity of smooth strongly convex minimization
- Second-order stochastic optimization for machine learning in linear time
- Inexact tensor methods and their application to stochastic convex optimization
- OFFO minimization algorithms for second-order optimality and their complexity
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- Utilizing second order information in minibatch stochastic variance reduced proximal iterations
- Lower bounds for finding stationary points I
- On lower iteration complexity bounds for the convex concave saddle point problems
- Lower bounds for finding stationary points II: first-order methods
- High-order optimization methods for fully composite problems
- A search-free \(O(1/k^{3/2})\) homotopy inexact proximal-Newton extragradient algorithm for monotone variational inequalities
- Superfast second-order methods for unconstrained convex optimization
- Implementable tensor methods in unconstrained convex optimization
- Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives
- High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
- A simple method for convex optimization in the oracle model
- A control-theoretic perspective on optimal high-order optimization
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- scientific article; zbMATH DE number 7626728
- Affine-invariant contracting-point methods for convex optimization
- Complexity bounds for second-order optimality in unconstrained optimization
- A unified adaptive tensor approximation scheme to accelerate composite convex optimization
- Inexact high-order proximal-point methods with auxiliary search procedure
- SketchySGD: reliable stochastic optimization via randomized curvature estimates
- Perseus: a simple and optimal high-order method for variational inequalities
- Unified acceleration of high-order algorithms under general Hölder continuity
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization