The following pages link to (Q4617601):
Displaying 20 items.
- Is there an analog of Nesterov acceleration for gradient-based MCMC? (Q2040101)
- Efficient stochastic optimisation by unadjusted Langevin Monte Carlo. Application to maximum marginal likelihood and empirical Bayesian estimation (Q2058738)
- Unadjusted Langevin algorithm for sampling a mixture of weakly smooth potentials (Q2083423)
- Oracle lower bounds for stochastic gradient sampling algorithms (Q2137007)
- Improved bounds for discretization of Langevin diffusions: near-optimal rates without convexity (Q2137032)
- On sampling from a log-concave density using kinetic Langevin diffusions (Q2174987)
- User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient (Q2280028)
- Nonasymptotic bounds for sampling algorithms without log-concavity (Q2657917)
- (Q4998932)
- (Q5053256)
- Particle dual averaging: optimization of mean field neural network with global convergence rate analysis* (Q5055425)
- (Q5159403)
- On Stochastic Gradient Langevin Dynamics with Dependent Data Streams: The Fully Nonconvex Case (Q5162623)
- (Q5214293)
- (Q5381127)
- ALMOND: Adaptive Latent Modeling and Optimization via Neural Networks and Langevin Diffusion (Q6040682)
- Global Optimization via Schrödinger–Föllmer Diffusion (Q6057791)
- Swarm gradient dynamics for global optimization: the mean-field limit case (Q6126662)
- Mirror variational transport: a particle-based algorithm for distributional optimization on constrained domains (Q6134347)
- Distributed event-triggered unadjusted Langevin algorithm for Bayesian learning (Q6136164)