Pages that link to "Item:Q2834481"
The following pages link to A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights (Q2834481):
Displaying 50 items.
- Optimized first-order methods for smooth convex minimization (Q312663)
- Asymptotic for the perturbed heavy ball system with vanishing damping term (Q523947)
- Lagrangian penalization scheme with parallel forward-backward splitting (Q725876)
- Continuous dynamics related to monotone inclusions and non-smooth optimization problems (Q829490)
- Inducing strong convergence of trajectories in dynamical systems associated to monotone inclusions with composite structure (Q831051)
- Algorithms of inertial mirror descent in convex problems of stochastic optimization (Q1641948)
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure (Q1659678)
- Optimal deterministic algorithm generation (Q1668803)
- Adaptive restart of the optimized gradient method for convex optimization (Q1670019)
- Stochastic heavy ball (Q1697485)
- Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient (Q1711864)
- Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization (Q1730794)
- Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators (Q1739043)
- Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity (Q1785926)
- A finite element/operator-splitting method for the numerical solution of the two dimensional elliptic Monge-Ampère equation (Q2000024)
- Selection dynamics for deep neural networks (Q2003969)
- On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems (Q2010091)
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions (Q2020604)
- Asymptotic analysis of a structure-preserving integrator for damped Hamiltonian systems (Q2030822)
- A second-order adaptive Douglas-Rachford dynamic method for maximal \(\alpha\)-monotone operators (Q2031284)
- On the convergence of a class of inertial dynamical systems with Tikhonov regularization (Q2047197)
- Continuous Newton-like inertial dynamics for monotone inclusions (Q2047250)
- A piecewise conservative method for unconstrained convex optimization (Q2070340)
- Iterative pre-conditioning for expediting the distributed gradient-descent method: the case of linear least-squares problem (Q2071934)
- Iterative ensemble Kalman methods: a unified perspective with some new variants (Q2072640)
- Stochastic relaxed inertial forward-backward-forward splitting for monotone inclusions in Hilbert spaces (Q2082546)
- Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems (Q2082554)
- Convergence rates of first- and higher-order dynamics for solving linear ill-posed problems (Q2088139)
- Understanding the acceleration phenomenon via high-resolution differential equations (Q2089769)
- From differential equation solvers to accelerated first-order methods for convex optimization (Q2089788)
- A control-theoretic perspective on optimal high-order optimization (Q2089793)
- Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak-Łojasiewicz condition (Q2089864)
- Convergence rates of damped inertial dynamics from multi-degree-of-freedom system (Q2091224)
- High-performance optimal incentive-seeking in transactive control for traffic congestion (Q2095352)
- Fast primal-dual algorithm via dynamical system for a linearly constrained convex optimization problem (Q2097697)
- On the effect of perturbations in first-order optimization methods with inertia and Hessian driven damping (Q2106043)
- A fast continuous time approach with time scaling for nonsmooth convex optimization (Q2110501)
- Generating Nesterov's accelerated gradient algorithm by using optimal control theory for optimization (Q2112702)
- Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling (Q2119789)
- Semi-discrete optimization through semi-discrete optimal transport: a framework for neural architecture search (Q2121586)
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition (Q2128612)
- Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization (Q2129627)
- Asymptotic for a second order evolution equation with damping and regularizing terms (Q2133235)
- First-order optimization algorithms via inertial systems with Hessian driven damping (Q2133411)
- First-order inertial algorithms involving dry friction damping (Q2133421)
- Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators (Q2134934)
- Sparse matrix linear models for structured high-throughput data (Q2135347)
- Asymptotic behavior of Newton-like inertial dynamics involving the sum of potential and nonpotential terms (Q2138450)
- Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics (Q2139279)
- Blended dynamics approach to distributed optimization: sum convexity and convergence rate (Q2139400)