Implementable tensor methods in unconstrained convex optimization

DOI: 10.1007/s10107-019-01449-1 · zbMath: 1459.90157 · OpenAlex: W2808862920 · MaRDI QID: Q2227532

Yu. E. Nesterov

Publication date: 15 February 2021

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-019-01449-1



Related Items

Tensor methods for finding approximate stationary points of convex functions
Inexact basic tensor methods for some classes of convex optimization problems
Local convergence of tensor methods
Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods
Optimal complexity and certification of Bregman first-order methods
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
Accelerated meta-algorithm for convex optimization problems
Superfast second-order methods for unconstrained convex optimization
Smoothness parameter of power of Euclidean norm
Cubic regularization methods with second-order complexity guarantee based on a new subproblem reformulation
Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
Super-Universal Regularized Newton Method
Efficiency of higher-order algorithms for minimizing composite functions
Smooth monotone stochastic variational inequalities and saddle point problems: a survey
First-order methods for convex optimization
Adaptive Third-Order Methods for Composite Convex Optimization
Inexact accelerated high-order proximal-point methods
Zeroth-order nonconvex stochastic optimization: handling constraints, high dimensionality, and saddle points
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization
Contracting Proximal Methods for Smooth Convex Optimization
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
An adaptive high order method for finding third-order critical points of nonconvex optimization
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
Critical Point-Finding Methods Reveal Gradient-Flat Regions of Deep Network Losses
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
Dual Space Preconditioning for Gradient Descent
Inexact High-Order Proximal-Point Methods with Auxiliary Search Procedure
Inexact model: a framework for optimization and variational inequalities
Higher-Order Methods for Convex-Concave Min-Max Optimization and Monotone Variational Inequalities
High-Order Optimization Methods for Fully Composite Problems
An Optimal High-Order Tensor Method for Convex Optimization

