A variational perspective on accelerated methods in optimization


DOI: 10.1073/pnas.1614734113
zbMath: 1404.90098
arXiv: 1603.04245
OpenAlex: W2963430672
Wikidata: Q37451331 (Scholia: Q37451331)
MaRDI QID: Q4646231

Andre Wibisono, Ashia C. Wilson, Michael I. Jordan

Publication date: 2016

Published in: Proceedings of the National Academy of Sciences

Full work available at URL: https://arxiv.org/abs/1603.04245



Related Items

Inducing strong convergence of trajectories in dynamical systems associated to monotone inclusions with composite structure
Stochastic mirror descent dynamics and their convergence in monotone variational inequalities
Essential convergence rate of ordinary differential equations appearing in optimization
Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
An \(O(s^r)\)-resolution ODE framework for understanding discrete-time algorithms and applications to the linear convergence of minimax problems
Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms
Stochastic Methods for Composite and Weakly Convex Optimization Problems
Finding geodesics joining given points
Exploring critical points of energy landscapes: from low-dimensional examples to phase field crystal PDEs
Explicit stabilised gradient descent for faster strongly convex optimisation
Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
Robust hybrid zero-order optimization algorithms with acceleration via averaging in time
Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
Accelerated differential inclusion for convex optimization
Inertial primal-dual dynamics with damping and scaling for linearly constrained convex optimization problems
Neurodynamic optimization approaches with finite/fixed-time convergence for absolute value equations
An ordinary differential equation for modeling Halpern fixed-point algorithm
Novel projection neurodynamic approaches for constrained convex optimization
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Practical perspectives on symplectic accelerated optimization
Deterministic neural networks optimization from a continuous and energy point of view
Fast gradient method for low-rank matrix estimation
A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization
No-regret algorithms in on-line learning, games and convex optimization
No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
Conformal mirror descent with logarithmic divergences
A new dynamical system with self-adaptive dynamical stepsize for pseudomonotone mixed variational inequalities
A Recursively Recurrent Neural Network (R2N2) Architecture for Learning Iterative Algorithms
Discrete processes and their continuous limits
Time-adaptive Lagrangian variational integrators for accelerated optimization
First-order methods for convex optimization
Bregman dynamics, contact transformations and convex optimization
From Halpern's fixed-point iterations to Nesterov's accelerated interpretations for root-finding problems
A second order primal-dual dynamical system for a convex-concave bilinear saddle point problem
Lagrangian and Hamiltonian dynamics for probabilities on the statistical bundle
On the Convergence of Gradient-Like Flows with Noisy Gradient Input
Contractivity of Runge-Kutta Methods for Convex Gradient Systems
Bregman Itoh-Abe methods for sparse optimisation
Multiscale Analysis of Accelerated Gradient Methods
A Continuous-Time Analysis of Distributed Stochastic Gradient
Dynamical Systems Theory and Algorithms for NP-hard Problems
The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
Accelerating Proximal Markov Chain Monte Carlo by Using an Explicit Stabilized Method
Accelerated Optimization in the PDE Framework: Formulations for the Active Contour Case
Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs
Selection dynamics for deep neural networks
User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient
Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
Exponential convergence of distributed primal-dual convex optimization algorithm without strong convexity
Mass-spring-damper networks for distributed optimization in non-Euclidean spaces
Is there an analog of Nesterov acceleration for gradient-based MCMC?
Continuous relaxations for the traveling salesman problem
Projected Dynamical Systems on Irregular, Non-Euclidean Domains for Nonlinear Optimization
Fractional differential equation approach for convex optimization with convergence rate analysis
On dissipative symplectic integration with applications to gradient-based optimization
Convergence Rates of Inertial Primal-Dual Dynamical Methods for Separable Convex Optimization Problems
Search Direction Correction with Normalized Gradient Makes First-Order Methods Faster
Accelerated information gradient flow
Accelerated variational PDEs for efficient solution of regularized inversion problems
Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
Hessian Barrier Algorithms for Linearly Constrained Optimization Problems
Revisiting the ODE method for recursive algorithms: fast convergence using quasi stochastic approximation
Iterative ensemble Kalman methods: a unified perspective with some new variants
The Connections Between Lyapunov Functions for Some Optimization Algorithms and Differential Equations
How does momentum benefit deep neural networks architecture design? A few case studies
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
Understanding the acceleration phenomenon via high-resolution differential equations
From differential equation solvers to accelerated first-order methods for convex optimization
A control-theoretic perspective on optimal high-order optimization
Implicit Regularization and Momentum Algorithms in Nonlinearly Parameterized Adaptive Control and Prediction
PDE acceleration: a convergence rate analysis and applications to obstacle problems
High-order symplectic Lie group methods on \(SO(n)\) using the polar decomposition
Finding extremals of Lagrangian actions
Generalized Momentum-Based Methods: A Hamiltonian Perspective
Conformal symplectic and relativistic optimization
Adaptive Hamiltonian Variational Integrators and Applications to Symplectic Accelerated Optimization
Fast primal-dual algorithm via dynamical system for a linearly constrained convex optimization problem
Online optimization of switched LTI systems using continuous-time and hybrid accelerated gradient flows
Triggered gradient tracking for asynchronous distributed optimization
A Variational Formulation of Accelerated Optimization on Riemannian Manifolds
A primal-dual flow for affine constrained convex optimization
An Optimal High-Order Tensor Method for Convex Optimization