A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights

From MaRDI portal
Publication: 2834481

zbMath: 1391.90667 · arXiv: 1503.01243 · MaRDI QID: Q2834481

Emmanuel J. Candès, Weijie Su, Stephen P. Boyd

Publication date: 22 November 2016

Full work available at URL: https://arxiv.org/abs/1503.01243
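For context beyond the portal record: the work referenced above studies the second-order ODE $\ddot X(t) + \frac{3}{t}\dot X(t) + \nabla f(X(t)) = 0$ as the continuous-time limit of Nesterov's accelerated gradient method. A minimal sketch of that discrete scheme on a toy quadratic (the objective, step size, and iteration count here are illustrative choices, not from the source):

```python
import numpy as np

# Nesterov's accelerated gradient method, whose continuous-time limit is
#   X''(t) + (3/t) X'(t) + grad f(X(t)) = 0,
# run on a toy quadratic f(x) = 0.5 x^T A x (illustrative example).

A = np.diag([1.0, 10.0])            # simple ill-conditioned quadratic
f = lambda x: 0.5 * x @ A @ x       # objective, minimized at x = 0
grad = lambda x: A @ x              # its gradient

s = 1.0 / 10.0                      # step size <= 1/L, L = largest eigenvalue of A
x_prev = x = np.array([1.0, 1.0])   # start both iterates at the same point
for k in range(1, 200):
    y = x + (k - 1) / (k + 2) * (x - x_prev)   # momentum extrapolation
    x_prev, x = x, y - s * grad(y)             # gradient step taken at y

print(f(x))   # objective after ~200 iterations: close to the minimum value 0
```

The $(k-1)/(k+2)$ momentum coefficient is what produces the $3/t$ damping term in the limiting ODE as the step size goes to zero.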



Related Items

Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling, Semi-discrete optimization through semi-discrete optimal transport: a framework for neural architecture search, Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition, Continuous dynamics related to monotone inclusions and non-smooth optimization problems, Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization, Inducing strong convergence of trajectories in dynamical systems associated to monotone inclusions with composite structure, Asymptotic for a second order evolution equation with damping and regularizing terms, First-order optimization algorithms via inertial systems with Hessian driven damping, First-order inertial algorithms involving dry friction damping, Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators, Sparse matrix linear models for structured high-throughput data, Asymptotic behavior of Newton-like inertial dynamics involving the sum of potential and nonpotential terms, Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics, Blended dynamics approach to distributed optimization: sum convexity and convergence rate, Optimized first-order methods for smooth convex minimization, Algorithms of inertial mirror descent in convex problems of stochastic optimization, An \(O(s^r)\)-resolution ODE framework for understanding discrete-time algorithms and applications to the linear convergence of minimax problems, Proportional-integral projected gradient method for conic optimization, Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems, Discrete dynamical system approaches for Boolean polynomial optimization, Fast inertial dynamic algorithm with smoothing method for nonsmooth convex optimization, New computational 
guarantees for solving convex optimization problems with first order methods, via a function growth condition measure, Exploring critical points of energy landscapes: from low-dimensional examples to phase field crystal PDEs, Generalizing the Optimized Gradient Method for Smooth Convex Minimization, Optimal deterministic algorithm generation, Explicit stabilised gradient descent for faster strongly convex optimisation, Adaptive restart of the optimized gradient method for convex optimization, Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints, Robust hybrid zero-order optimization algorithms with acceleration via averaging in time, An engineering interpretation of Nesterov's convex minimization algorithm and time integration: application to optimal fiber orientation, A regularization interpretation of the proximal point method for weakly convex functions, A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem, Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity, First-order frameworks for continuous Newton-like dynamics governed by maximally monotone operators, Newton-type inertial algorithms for solving monotone equations governed by sums of potential and nonpotential operators, An optimal control theory for nonlinear optimization, Second-order flows for computing the ground states of rotating Bose-Einstein condensates, Accelerated gradient boosting, Convergence rates of the heavy-ball method under the Łojasiewicz property, Discrete processes and their continuous limits, Stochastic heavy ball, Network flows that solve least squares for linear equations, Bregman Itoh-Abe methods for sparse optimisation, Optimal decay rates for semi-linear non-autonomous evolution equations with vanishing damping, Optimal convergence rates for damped inertial gradient dynamics with flat geometries, The Differential Inclusion
Modeling FISTA Algorithm and Optimality of Convergence Rate in the Case $b \leq 3$, On FISTA with a relative error rule, Applying FISTA to optimization problems (with or) without minimizers, Fast convergence of inertial gradient dynamics with multiscale aspects, Halting time is predictable for large models: a universality property and average-case analysis, Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient, Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization, Tikhonov regularization of a second order dynamical system with Hessian driven damping, Convergence Rates of Damped Inertial Dynamics under Geometric Conditions and Perturbations, Alternating minimization methods for strongly convex optimization, Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping, An extension of the second order dynamical system that models Nesterov's convex gradient method, Stochastic optimization with momentum: convergence, fluctuations, and traps avoidance, Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators, Accelerated proximal point method for maximally monotone operators, Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization, Asymptotic for the perturbed heavy ball system with vanishing damping term, A finite element/operator-splitting method for the numerical solution of the two dimensional elliptic Monge-Ampère equation, Selection dynamics for deep neural networks, On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems, Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions, Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity, Asymptotic analysis of a structure-preserving integrator for
damped Hamiltonian systems, Lagrangian penalization scheme with parallel forward-backward splitting, A second-order adaptive Douglas-Rachford dynamic method for maximal \(\alpha\)-monotone operators, Second Order Forward-Backward Dynamical Systems For Monotone Inclusion Problems, On the convergence of a class of inertial dynamical systems with Tikhonov regularization, Continuous Newton-like inertial dynamics for monotone inclusions, Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule, New analysis of linear convergence of gradient-type methods via unifying error bound conditions, Fractional differential equation approach for convex optimization with convergence rate analysis, On dissipative symplectic integration with applications to gradient-based optimization, Convergence Rates of Inertial Primal-Dual Dynamical Methods for Separable Convex Optimization Problems, Search Direction Correction with Normalized Gradient Makes First-Order Methods Faster, Second order asymptotical regularization methods for inverse problems in partial differential equations, Nesterov-aided stochastic gradient methods using Laplace approximation for Bayesian design optimization, Asymptotic equivalence of evolution equations governed by cocoercive operators and their forward discretizations, A piecewise conservative method for unconstrained convex optimization, Iterative pre-conditioning for expediting the distributed gradient-descent method: the case of linear least-squares problem, Iterative ensemble Kalman methods: a unified perspective with some new variants, Stochastic relaxed inertial forward-backward-forward splitting for monotone inclusions in Hilbert spaces, Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems, Convergence rates of first- and higher-order dynamics for solving linear ill-posed problems, Understanding the acceleration phenomenon via 
high-resolution differential equations, From differential equation solvers to accelerated first-order methods for convex optimization, A control-theoretic perspective on optimal high-order optimization, Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak-Łojasiewicz condition, Convergence rates of damped inertial dynamics from multi-degree-of-freedom system, Generalized affine scaling algorithms for linear programming problems, Convergence of a relaxed inertial forward-backward algorithm for structured monotone inclusions, High-performance optimal incentive-seeking in transactive control for traffic congestion, Fast primal-dual algorithm via dynamical system for a linearly constrained convex optimization problem, On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping, A fast continuous time approach with time scaling for nonsmooth convex optimization, Generating Nesterov's accelerated gradient algorithm by using optimal control theory for optimization, Deriving efficient optimization methods based on stable explicit numerical methods, Essential convergence rate of ordinary differential equations appearing in optimization, The rate of convergence of optimization algorithms obtained via discretizations of heavy ball dynamical systems for convex optimization problems, An Efficient Algorithm for Minimizing Multi Non-Smooth Component Functions, PROBLEMS OF DIFFERENTIAL AND TOPOLOGICAL DIAGNOSTICS. PART 5. 
THE CASE OF TRAJECTORIAL MEASUREMENTS WITH ERROR, Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method, Continuous-Time Convergence Rates in Potential and Monotone Games, Running Primal-Dual Gradient Method for Time-Varying Nonconvex Problems, Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization, Accelerated differential inclusion for convex optimization, On the strong convergence of the trajectories of a Tikhonov regularized second order dynamical system with asymptotically vanishing damping, Fast augmented Lagrangian method in the convex regime with convergence guarantees for the iterates, Nesterov's Method for Convex Optimization, Accelerated dynamics with dry friction via time scaling and averaging of doubly nonlinear evolution equations, A second-order accelerated neurodynamic approach for distributed convex optimization, Inertial primal-dual dynamics with damping and scaling for linearly constrained convex optimization problems, Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions, Fast and stable composite learning via high-order optimization, On the strong convergence of continuous Newton-like inertial dynamics with Tikhonov regularization for monotone inclusions, A Unifying Framework and Comparison of Algorithms for Non-negative Matrix Factorisation, Fast continuous dynamics inside the graph of subdifferentials of nonsmooth convex functions, An ordinary differential equation for modeling Halpern fixed-point algorithm, Nesterov's acceleration for level set-based topology optimization using reaction-diffusion equations, Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods, Distributed solving linear algebraic equations with switched fractional order dynamics, Robust cost-sensitive kernel method with Blinex loss and its applications in credit risk evaluation, Strong Convergence of Trajectories via Inertial Dynamics 
Combining Hessian-Driven Damping and Tikhonov Regularization for General Convex Minimizations, A speed restart scheme for a dynamics with Hessian-driven damping, Practical perspectives on symplectic accelerated optimization, Friction-adaptive descent: a family of dynamics-based optimization methods, On the second-order asymptotical regularization of linear ill-posed inverse problems, On the Generalized Langevin Equation for Simulated Annealing, Deterministic neural networks optimization from a continuous and energy point of view, Accelerated smoothing hard thresholding algorithms for \(\ell_0\) regularized nonsmooth convex regression problem, A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization, FISTA is an automatic geometrically optimized algorithm for strongly convex functions, No-regret algorithms in on-line learning, games and convex optimization, Convergence rate of a relaxed inertial proximal algorithm for convex minimization, A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems, First order inertial optimization algorithms with threshold effects associated with dry friction, SOLO FTRL algorithm for production management with transfer prices, A Convergence Study of SGD-Type Methods for Stochastic Optimization, Time-adaptive Lagrangian variational integrators for accelerated optimization, Convergence of inertial dynamics driven by sums of potential and nonpotential operators with implicit Newton-like damping, A forward-backward algorithm with different inertial terms for structured non-convex minimization problems, Time rescaling of a primal-dual dynamical system with asymptotically vanishing damping, Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian, First-order methods for convex optimization, Bregman dynamics, contact transformations and convex optimization, Fast 
optimization via inertial dynamics with closed-loop damping, Existence results on Lagrange multiplier approach for gradient flows and application to optimization, A second order primal-dual dynamical system for a convex-concave bilinear saddle point problem, Lagrangian and Hamiltonian dynamics for probabilities on the statistical bundle, On mathematical modeling in image reconstruction and beyond, Contractivity of Runge--Kutta Methods for Convex Gradient Systems, Finite Convergence of Proximal-Gradient Inertial Algorithms Combining Dry Friction with Hessian-Driven Damping, Convergence Rates of Inertial Forward-Backward Algorithms, Multiscale Analysis of Accelerated Gradient Methods, Second-order dynamical systems with penalty terms associated to monotone inclusions, The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods, Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. Part II: Theoretical Analysis, Asymptotic for a second-order evolution equation with convex potential and vanishing damping term, PROBLEMS OF DIFFERENTIAL AND TOPOLOGICAL DIAGNOSTICS. PART 4. THE CASE OF EXACT TRAJECTORIAL MEASUREMENTS, Image Restoration: Wavelet Frame Shrinkage, Nonlinear Evolution PDEs, and Beyond, Accelerated Residual Methods for the Iterative Solution of Systems of Equations, Coherence Retrieval Using Trace Regularization, Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems, Proximal Distance Algorithms: Theory and Examples, PROBLEMS OF DIFFERENTIAL AND TOPOLOGICAL DIAGNOSTICS. PART 1. 
MOTION EQUATIONS AND CLASSIFICATION OF MALFUNCTIONS, Optimal Convergence Rates for Nesterov Acceleration, A second-order dynamical approach with variable damping to nonconvex smooth minimization, Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming, A Dynamical Approach to Two-Block Separable Convex Optimization Problems with Linear Constraints, The Connections Between Lyapunov Functions for Some Optimization Algorithms and Differential Equations, Partial differential equation regularization for supervised machine learning, A new class of accelerated regularization methods, with application to bioluminescence tomography, Implicit Regularization and Momentum Algorithms in Nonlinearly Parameterized Adaptive Control and Prediction, Accelerated Iterative Regularization via Dual Diagonal Descent, Generalized Momentum-Based Methods: A Hamiltonian Perspective, Conformal symplectic and relativistic optimization, Adaptive Hamiltonian Variational Integrators and Applications to Symplectic Accelerated Optimization, Time-varying continuous-time optimisation with pre-defined finite-time stability, A Class of Second-Order Geometric Quasilinear Hyperbolic PDEs and Their Application in Imaging, Tikhonov Regularization of a Perturbed Heavy Ball System with Vanishing Damping, A Variational Formulation of Accelerated Optimization on Riemannian Manifolds, A primal-dual flow for affine constrained convex optimization, On the Effectiveness of Richardson Extrapolation in Data Science, An Optimal High-Order Tensor Method for Convex Optimization, Proximal Gradient Methods for Machine Learning and Imaging