scientific article; zbMATH DE number 7306890
From MaRDI portal
Publication:5148992
Publication date: 5 February 2021
Full work available at URL: https://arxiv.org/abs/1711.09514
Title of this publication is not available
optimization; stochastic differential equation; weak convergence; ordinary differential equation; acceleration; gradient descent; stochastic gradient descent; mini-batch; gradient flow central limit theorem; joint asymptotic analysis; joint computational and statistical analysis; Lagrangian flow central limit theorem
Cites Work
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Title not available
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Title not available
- Acceleration of Stochastic Approximation by Averaging
- An approximation of partial sums of independent RV's, and the sample DF. II
- An approximation of partial sums of independent RV's, and the sample DF. I
- Title not available
- Title not available
- Title not available
- Title not available
- Robust Stochastic Approximation Approach to Stochastic Programming
- Title not available
- Title not available
- Title not available
- Cube root asymptotics
- Nonlinear optimization.
- Title not available
- Title not available
- Title not available
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Title not available
- Title not available
- Ergodicity for Infinite Dimensional Systems
- Numerical Methods for Ordinary Differential Equations
- Bootstrapping empirical functions
- Stochastic methods. A handbook for the natural and social sciences
- A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
- Strong approximation for multivariate empirical and related processes, via KMT constructions
- Strong approximation for set-indexed partial sum processes via KMT constructions. I
- Approximation for bootstrapped empirical processes
- The exit problem for small random perturbations of dynamical systems with a hyperbolic fixed point
- Lectures on white noise functionals.
- Theoretical Guarantees for Approximate Sampling from Smooth and Log-Concave Densities
- Strong approximation for set-indexed partial-sum processes, via KMT constructions. II
- Scalable estimation strategies based on stochastic approximations: classical results and new insights
- Title not available
- A variational perspective on accelerated methods in optimization
- Stochastic Gradient Descent in Continuous Time
- Asymptotic and finite-sample properties of estimators based on stochastic gradients
- User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient
- Statistical inference for model parameters in stochastic gradient descent
Cited In (5)
- Gradient procedures for stochastic approximation with dependent noise and their asymptotic behaviour
- Analysis of stochastic gradient descent in continuous time
- Central limit theorems for stochastic gradient descent with averaging for stable manifolds
- Discrete-time simulated annealing: a convergence analysis via the Eyring-Kramers law
- Asymptotic bias of stochastic gradient search