Smoothness parameter of power of Euclidean norm
Publication: 2178876
DOI: 10.1007/s10957-020-01653-6
zbMath: 1453.46041
arXiv: 1907.12346
OpenAlex: W3100555527
Wikidata: Q95275581
Scholia: Q95275581
MaRDI QID: Q2178876
Yu. E. Nesterov, Anton Rodomanov
Publication date: 11 May 2020
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1907.12346
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- New results on superlinear convergence of classical quasi-Newton methods
- High-Order Optimization Methods for Fully Composite Problems
Cites Work
- Universal gradient methods for convex optimization problems
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Accelerating the cubic regularization of Newton's method on convex problems
- On uniformly convex functionals
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Implementable tensor methods in unconstrained convex optimization
- Cubic regularization of Newton method and its global performance
- Tensor Methods for Unconstrained Optimization Using Second Derivatives
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians