Superfast second-order methods for unconstrained convex optimization
DOI: 10.1007/s10957-021-01930-y
zbMATH: 1480.90195
OpenAlex: W3092425772
MaRDI QID: Q2664892
Publication date: 18 November 2021
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01930-y
Related Items
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- Super-Universal Regularized Newton Method
- Adaptive Third-Order Methods for Composite Convex Optimization
- Inexact accelerated high-order proximal-point methods
- Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
Cites Work
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Cubic regularization of Newton method and its global performance
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- On inexact solution of auxiliary problems in tensor methods for convex optimization