Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
From MaRDI portal
Publication:5058404
Recommendations
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- An optimal high-order tensor method for convex optimization
- Reachability of optimal convergence rate estimates for high-order numerical convex optimization methods
- Local convergence of tensor methods
- On inexact solution of auxiliary problems in tensor methods for convex optimization
Cites work
- Scientific article; zbMATH DE number 1369459
- Scientific article; zbMATH DE number 3341597
- Scientific article; zbMATH DE number 3365044
- A unified framework for some inexact proximal point algorithms
- A control-theoretic perspective on optimal high-order optimization
- A dynamic approach to a proximal-Newton method for monotone inclusions in Hilbert spaces, with complexity \(\mathcal{O}(1/n^2)\)
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
- Accelerated regularized Newton methods for minimizing composite convex functions
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- An inexact hybrid generalized proximal point algorithm and some new results on the theory of Bregman functions
- An optimal high-order tensor method for convex optimization
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Implementable tensor methods in unconstrained convex optimization
- Inexact accelerated high-order proximal-point methods
- Inexact high-order proximal-point methods with auxiliary search procedure
- Introductory lectures on convex optimization. A basic course.
- Iteration-complexity of a Newton proximal extragradient method for monotone variational inequalities and inclusion problems
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Monotone Operators and the Proximal Point Algorithm
- On the complexity of the hybrid proximal extragradient method for the iterates and the ergodic mean
- Oracle complexity of second-order methods for smooth convex optimization
- Regularized HPE-Type Methods for Solving Monotone Inclusions with Improved Pointwise Iteration-Complexity Bounds
- Relative-error approximate versions of Douglas-Rachford splitting and special cases of the ADMM
- Smooth minimization of non-smooth functions
Cited in (5)
- A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Perseus: a simple and optimal high-order method for variational inequalities
- An accelerated regularized Chebyshev-Halley method for unconstrained optimization
- Reachability of optimal convergence rate estimates for high-order numerical convex optimization methods