Contracting Proximal Methods for Smooth Convex Optimization
Publication: 5139832
DOI: 10.1137/19M130769X · zbMath: 1454.90052 · arXiv: 1912.07972 · MaRDI QID: Q5139832
Authors: Nikita Doikov, Yu. E. Nesterov
Publication date: 11 December 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1912.07972
Numerical mathematical programming methods (65K05) · Convex programming (90C25) · Nonlinear programming (90C30)
Related Items
- Tensor methods for finding approximate stationary points of convex functions
- Accelerated meta-algorithm for convex optimization problems
- Accelerated smoothing hard thresholding algorithms for \(\ell_0\) regularized nonsmooth convex regression problem
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Super-Universal Regularized Newton Method
- An inexact primal-dual smoothing framework for large-scale non-bilinear saddle point problems
- Gradient regularization of Newton method with Bregman distances
- Adaptive Catalyst for Smooth Convex Optimization
- Near-Optimal Hyperfast Second-Order Method for Convex Optimization
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
- High-Order Optimization Methods for Fully Composite Problems
- Accelerated proximal envelopes: application to componentwise methods
- On the computational efficiency of catalyst accelerated coordinate descent
Cites Work
- Gradient methods for minimizing composite functions
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Universal method for stochastic composite optimization problems
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Forward-backward splitting with Bregman distances
- Cubic regularization of Newton method and its global performance
- A unified framework for some inexact proximal point algorithms
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- New Proximal Point Algorithms for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications