Taming Neural Networks with TUSLA: Nonconvex Learning via Adaptive Stochastic Gradient Langevin Algorithms
From MaRDI portal
Publication:6162009
DOI: 10.1137/22m1514283
zbMath: 1518.65007
arXiv: 2006.14514
OpenAlex: W3037401783
MaRDI QID: Q6162009
Sotirios Sabanis, Attila Lovas, Miklós Rásonyi, Unnamed Author
Publication date: 28 June 2023
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2006.14514
MSC classification:
- Artificial neural networks and deep learning (68T07)
- Monte Carlo methods (65C05)
- Stochastic learning and adaptive control (93E35)
- Sequential statistical analysis (62L10)
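The taming idea behind TUSLA can be illustrated with a minimal sketch. This is not the authors' reference implementation; the step size, inverse temperature, regularization strength, and taming exponent below are illustrative placeholders, and the update follows the general shape of a tamed stochastic gradient Langevin step, in which the drift is divided by a superlinearly growing normalizer so a single step cannot blow up:

```python
import numpy as np

def tusla_step(theta, grad, step=1e-3, beta=1e8, r=1.0, eta=1e-4, rng=None):
    """One tamed stochastic gradient Langevin update (illustrative sketch).

    The regularization term eta * theta * |theta|^(2r) and the taming
    denominator 1 + sqrt(step) * |theta|^(2r) keep the effective drift
    bounded per iteration, which is the mechanism that prevents the
    finite-time divergence plain Euler schemes exhibit when the
    coefficients grow superlinearly.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm_2r = np.linalg.norm(theta) ** (2 * r)
    drift = (grad + eta * theta * norm_2r) / (1.0 + np.sqrt(step) * norm_2r)
    noise = np.sqrt(2.0 * step / beta) * rng.standard_normal(theta.shape)
    return theta - step * drift + noise
```

For example, iterating this step on the quartic objective f(theta) = |theta|^4 (whose gradient 4 * theta * |theta|^2 grows superlinearly) stays stable even from a large initial point, whereas an untamed Euler step with the same step size can diverge.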
Cites Work
- Euler approximations with varying coefficients: the case of superlinearly growing diffusion coefficients
- Strong convergence of an explicit numerical method for SDEs with nonglobally Lipschitz continuous coefficients
- A note on tamed Euler approximations
- Laplace's method revisited: Weak convergence of probability measures
- On stochastic gradient Langevin dynamics with dependent data streams in the logconcave case
- The tamed unadjusted Langevin algorithm
- User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient
- High-dimensional Bayesian inference via the unadjusted Langevin algorithm
- Higher order Langevin Monte Carlo algorithm
- Couplings and quantitative contraction rates for Langevin dynamics
- Nonasymptotic convergence analysis for the unadjusted Langevin algorithm
- Nonasymptotic estimates for stochastic gradient Langevin dynamics under local conditions in nonconvex optimization
- Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients
- Quantitative Harris-type theorems for diffusions and McKean–Vlasov processes
- Convergence and Dynamical Behavior of the ADAM Algorithm for Nonconvex Stochastic Optimization
- On Stochastic Gradient Langevin Dynamics with Dependent Data Streams: The Fully Nonconvex Case
- Theoretical Guarantees for Approximate Sampling from Smooth and Log-Concave Densities