Nesterov's Method for Convex Optimization
Publication: 6046292
DOI: 10.1137/21M1390037
zbMATH: 1526.65025
OpenAlex: W4375860026
Wikidata: Q130240292 (Scholia: Q130240292)
MaRDI QID: Q6046292
Publication date: 10 May 2023
Published in: SIAM Review
Full work available at URL: https://doi.org/10.1137/21m1390037
Cites Work
- Nonlinear total variation based noise removal algorithms
- On the convergence of the iterates of the ``fast iterative shrinkage/thresholding algorithm''
- Axioms and fundamental equations of image processing
- Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Minimization of functions having Lipschitz continuous first partial derivatives
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Optimization—Theory and Practice
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Convex Analysis