Oracle complexity of second-order methods for smooth convex optimization
From MaRDI portal
Publication:2330652
DOI: 10.1007/S10107-018-1293-1
zbMATH Open: 1430.90460
arXiv: 1705.07260
OpenAlex: W2962824898
MaRDI QID: Q2330652
FDO: Q2330652
Authors: Yossi Arjevani, Ohad Shamir, Ron Shiff
Publication date: 22 October 2019
Published in: Mathematical Programming. Series A. Series B
Abstract: Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we prove tight bounds on the oracle complexity of such methods for smooth convex functions, or equivalently, the worst-case number of iterations required to optimize such functions to a given accuracy. In particular, these bounds indicate when such methods can or cannot improve on gradient-based methods, whose oracle complexity is much better understood. We also provide generalizations of our results to higher-order methods.
Full work available at URL: https://arxiv.org/abs/1705.07260
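For orientation, a minimal LaTeX sketch of the shape of the bounds discussed in the abstract, assuming the standard setting (convex objective with Lipschitz-continuous derivatives, target accuracy \(\varepsilon\)); logarithmic factors and the dependence on Lipschitz constants and the initial distance to the optimum are omitted, and the precise statements are those in the paper:

\[
\underbrace{\Theta\bigl(\varepsilon^{-1/2}\bigr)}_{\text{first-order methods}}
\qquad\text{vs.}\qquad
\underbrace{\tilde{\Theta}\bigl(\varepsilon^{-2/7}\bigr)}_{\text{second-order methods}},
\qquad\text{and, for \(k\)-th order methods,}\quad
\tilde{\Theta}\bigl(\varepsilon^{-2/(3k+1)}\bigr).
\]

Read this way, the improvement from gradient-based to Hessian-based methods in the smooth convex regime is a drop of the exponent from \(1/2\) to \(2/7\), and higher-order oracles push it further towards \(2/(3k+1)\).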
Recommendations
- An optimal gradient method for smooth strongly convex minimization
- Superfast second-order methods for unconstrained convex optimization
- Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization
- On the oracle complexity of smooth strongly convex minimization
- The exact information-based complexity of smooth convex minimization
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Numerical methods based on nonlinear programming (49M37)
Cites Work
- Title not available
- Title not available
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Title not available
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- Title not available
- Cubic regularization of Newton method and its global performance
- Evaluation complexity of adaptive cubic regularization methods for convex unconstrained optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- On uniformly convex functionals
- Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
Cited In (33)
- Title not available
- Super-Universal Regularized Newton Method
- Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- On the oracle complexity of smooth strongly convex minimization
- Inexact tensor methods and their application to stochastic convex optimization
- High-Order Optimization Methods for Fully Composite Problems
- OFFO minimization algorithms for second-order optimality and their complexity
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
- A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
- Lower bounds for finding stationary points I
- On lower iteration complexity bounds for the convex concave saddle point problems
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- Lower bounds for finding stationary points II: first-order methods
- A search-free \(O(1/k^{3/2})\) homotopy inexact proximal-Newton extragradient algorithm for monotone variational inequalities
- Superfast second-order methods for unconstrained convex optimization
- Implementable tensor methods in unconstrained convex optimization
- High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
- A simple method for convex optimization in the oracle model
- An Optimal High-Order Tensor Method for Convex Optimization
- A control-theoretic perspective on optimal high-order optimization
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- Title not available
- Contracting Proximal Methods for Smooth Convex Optimization
- Affine-invariant contracting-point methods for convex optimization
- Title not available
- Complexity bounds for second-order optimality in unconstrained optimization
- SketchySGD: reliable stochastic optimization via randomized curvature estimates
- Perseus: a simple and optimal high-order method for variational inequalities
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization
- Higher-Order Methods for Convex-Concave Min-Max Optimization and Monotone Variational Inequalities