Superfast second-order methods for unconstrained convex optimization
From MaRDI portal
Publication:2664892
DOI: 10.1007/S10957-021-01930-Y · zbMATH Open: 1480.90195 · OpenAlex: W3092425772 · MaRDI QID: Q2664892
Authors: Yuri Nesterov
Publication date: 18 November 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01930-y
Recommendations
- Near-optimal hyperfast second-order method for convex optimization
- A second order method for unconstrained optimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Accelerated methods for nonconvex optimization
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Accelerated Uzawa methods for convex optimization
- Faster Lagrangian-based methods in convex optimization
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Fast alternating direction optimization methods
- A family of second-order methods for convex \(\ell_1\)-regularized optimization
Cites Work
- Title not available
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Cubic regularization of Newton method and its global performance
- Accelerating the cubic regularization of Newton's method on convex problems
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Oracle complexity of second-order methods for smooth convex optimization
- Lectures on convex optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Implementable tensor methods in unconstrained convex optimization
- On inexact solution of auxiliary problems in tensor methods for convex optimization
Cited In (13)
- Super-Universal Regularized Newton Method
- Inexact tensor methods and their application to stochastic convex optimization
- Adaptive Third-Order Methods for Composite Convex Optimization
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Inexact accelerated high-order proximal-point methods
- Fast convex optimization via a third-order in time evolution equation
- Perseus: a simple and optimal high-order method for variational inequalities
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence