Near-optimal hyperfast second-order method for convex optimization

From MaRDI portal
Publication:4965110

DOI: 10.1007/978-3-030-58657-7_15
zbMATH Open: 1460.90133
arXiv: 2002.09050
OpenAlex: W3084935096
MaRDI QID: Q4965110


Authors: Dmitry Kamzolov


Publication date: 25 February 2021

Published in: Mathematical Optimization Theory and Operations Research

Abstract: In this paper, we present a new Hyperfast Second-Order Method with convergence rate $O(N^{-5})$ up to a logarithmic factor for convex functions with Lipschitz third derivative. This method is based on two ideas. The first comes from the superfast second-order scheme of Yu. Nesterov (CORE Discussion Paper 2020/07, 2020), which implements a third-order scheme by solving the subproblem using only a second-order oracle; that method converges with rate $O(N^{-4})$. The second idea comes from the work of Kamzolov et al. (arXiv:2002.01004): an inexact near-optimal third-order method. In this work, we improve its convergence and merge it with the scheme of solving the subproblem using only a second-order oracle. As a result, we obtain a convergence rate of $O(N^{-5})$ up to a logarithmic factor. This rate is near-optimal and the best known to date. Further, we investigate the case where the objective is a sum of two functions and improve the sliding framework from Kamzolov et al. (arXiv:2002.01004) for second-order methods.
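The rates quoted in the abstract can be stated explicitly. Writing $f^\star$ for the optimal value and $x_N$ for the iterate after $N$ oracle calls (a paraphrase of the abstract's claims; constants and the exact logarithmic factor are given in the paper itself):

```latex
% Nesterov's superfast second-order scheme (CORE DP 2020/07):
\[
  f(x_N) - f^\star \;=\; O\!\left(N^{-4}\right),
\]
% the hyperfast method of this paper, for convex f with Lipschitz
% third derivative (logarithmic factors hidden in \tilde{O}):
\[
  f(x_N) - f^\star \;=\; \tilde{O}\!\left(N^{-5}\right).
\]
```

Both methods use only a second-order oracle (gradients and Hessians), while achieving the faster rates normally associated with third-order schemes.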


Full work available at URL: https://arxiv.org/abs/2002.09050








Cites Work


Cited In (10)





