Potential-function proofs for gradient methods
DOI: 10.4086/TOC.2019.V015A004
zbMATH Open: 1482.90145
OpenAlex: W2981349516
MaRDI QID: Q5204822
FDO: Q5204822
Authors: Nikhil Bansal, Anupam Gupta
Publication date: 5 December 2019
Published in: Theory of Computing
Full work available at URL: https://doi.org/10.4086/toc.2019.v015a004
Recommendations
- Convergence of first-order methods via the convex conjugate
- On the convergence analysis of the optimized gradient method
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Gradient Convergence in Gradient Methods with Errors
- First-order methods for convex optimization
Cited In (10)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- The regularized submodular maximization via the Lyapunov method
- The rate of convergence of optimization algorithms obtained via discretizations of heavy ball dynamical systems for convex optimization problems
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- A Laplacian approach to \(\ell_1\)-norm minimization
- Recent theoretical advances in decentralized distributed convex optimization
- A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
- An optimal gradient method for smooth strongly convex minimization
- No-regret algorithms in on-line learning, games and convex optimization