Generalized Momentum-Based Methods: A Hamiltonian Perspective
Publication: 5857293
DOI: 10.1137/20M1322716 · zbMath: 1462.90087 · arXiv: 1906.00436 · OpenAlex: W3136523593 · MaRDI QID: Q5857293
Jelena Diakonikolas, Michael I. Jordan
Publication date: 31 March 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1906.00436
MSC classifications: Numerical mathematical programming methods (65K05) · Convex programming (90C25) · Duality theory (optimization) (49N15)
Related Items
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
- Bregman dynamics, contact transformations and convex optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Asymptotic analysis of a structure-preserving integrator for damped Hamiltonian systems
- Implicit Regularization and Momentum Algorithms in Nonlinearly Parameterized Adaptive Control and Prediction
Cites Work
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Lectures on convex optimization
- Universal method for stochastic composite optimization problems
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Discrete processes and their continuous limits
- Lower bounds for finding stationary points I
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Generalized uniformly optimal methods for nonlinear programming
- On damped second-order gradient systems
- Behavior of accelerated gradient methods near critical points of nonconvex functions
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Inertial Game Dynamics and Applications to Constrained Optimization
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- On the long time behavior of second order differential equations with asymptotically small dissipation
- Optimal methods of smooth convex minimization
- New Proximal Point Algorithms for Convex Minimization
- The Heavy Ball with Friction Method, I. The Continuous Dynamical System: Global Exploration of the Local Minima of a Real-Valued Function by Asymptotic Analysis of a Dissipative Dynamical System
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- An Optimal First Order Method Based on Optimal Quadratic Averaging
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- A variational perspective on accelerated methods in optimization
- Lower Bounds for Parallel and Randomized Convex Optimization
- Area-convexity, ℓ∞ regularization, and undirected multicommodity flow
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
- An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations
- A new approach to computing maximum flows using electrical flows
- Some methods of speeding up the convergence of iteration methods
- Conformal symplectic and relativistic optimization