Near-optimal hyperfast second-order method for convex optimization
DOI: 10.1007/978-3-030-58657-7_15
zbMATH Open: 1460.90133
arXiv: 2002.09050
OpenAlex: W3084935096
MaRDI QID: Q4965110
Authors: Dmitry Kamzolov
Publication date: 25 February 2021
Published in: Mathematical Optimization Theory and Operations Research
Abstract: In this paper, we present a new Hyperfast Second-Order Method with convergence rate \(O(N^{-5})\) up to a logarithmic factor for convex functions with Lipschitz third derivative. The method is based on two ideas. The first comes from the superfast second-order scheme of Yu. Nesterov (CORE Discussion Paper 2020/07, 2020), which implements a third-order scheme by solving the subproblem using only the second-order oracle; that method converges with rate \(O(N^{-4})\). The second idea comes from the work of Kamzolov et al. (arXiv:2002.01004): an inexact near-optimal third-order method. In this work, we improve its convergence and merge it with the scheme of solving the subproblem using only the second-order oracle. As a result, we obtain a convergence rate of \(O(N^{-5})\) up to a logarithmic factor. This rate is near-optimal and the best known to date. Further, we investigate the case where the objective is a sum of two functions and improve the sliding framework of Kamzolov et al. (arXiv:2002.01004) for second-order methods.
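For orientation, the subproblem mentioned in the abstract is the minimization of a regularized third-order Taylor model. The display below is a sketch in standard tensor-method notation; the model name \(\Omega_M\) and the tying of the regularization constant \(M\) to the Lipschitz constant of \(\nabla^3 f\) are notational assumptions for illustration, not quoted from the paper:

\[
\Omega_M(x_k; x) \;=\; f(x_k) + \langle \nabla f(x_k),\, x - x_k \rangle
+ \tfrac{1}{2}\,\nabla^2 f(x_k)[x - x_k]^2
+ \tfrac{1}{6}\,\nabla^3 f(x_k)[x - x_k]^3
+ \tfrac{M}{24}\,\|x - x_k\|^4,
\]
\[
x_{k+1} \;\approx\; \operatorname*{arg\,min}_{x}\; \Omega_M(x_k; x).
\]

The point of the superfast/hyperfast construction is that this fourth-degree polynomial model can be minimized to sufficient accuracy using only gradient and Hessian evaluations, so the overall method requires just a second-order oracle while achieving the near-optimal rate associated with third-order schemes.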
Full work available at URL: https://arxiv.org/abs/2002.09050
Recommendations
- Superfast second-order methods for unconstrained convex optimization
- Reachability of optimal convergence rate estimates for high-order numerical convex optimization methods
- Implementable tensor methods in unconstrained convex optimization
- Fast convex optimization via a third-order in time evolution equation
- An optimal high-order tensor method for convex optimization
Cites Work
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Cubic regularization of Newton method and its global performance
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Lectures on convex optimization
- Tensor methods for minimizing convex functions with Hölder continuous higher-order derivatives
- Lower bounds for finding stationary points I
- Accelerated regularized Newton methods for minimizing composite convex functions
- Contracting proximal methods for smooth convex optimization
Cited In (10)
- Super-Universal Regularized Newton Method
- Machine Learning: ECML 2004
- Local convergence of tensor methods
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- Superfast second-order methods for unconstrained convex optimization
- A control-theoretic perspective on optimal high-order optimization
- Fast convex optimization via a third-order in time evolution equation
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Accelerations for global optimization covering methods using second derivatives