Log-concave sampling: Metropolis-Hastings algorithms are fast
From MaRDI portal
Publication: 5214293
zbMATH Open: 1440.62039
arXiv: 1801.02309
MaRDI QID: Q5214293
FDO: Q5214293
Authors: Raaz Dwivedi, Yuansi Chen, Martin J. Wainwright, Bin Yu
Publication date: 7 February 2020
Full work available at URL: https://arxiv.org/abs/1801.02309
Recommendations
- Error bounds for Metropolis-Hastings algorithms applied to perturbations of Gaussian measures in high dimensions
- Sampling from a log-concave distribution with projected Langevin Monte Carlo
- Fast mixing of Metropolized Hamiltonian Monte Carlo: benefits of multi-step gradients
- The geometry of logconcave functions and sampling algorithms
- Theoretical Guarantees for Approximate Sampling from Smooth and Log-Concave Densities
Cites Work
- Title not available
- Title not available
- Exponential convergence of Langevin distributions and their discrete approximations
- Markov Chains and Stochastic Stability
- Title not available
- Handbook of Markov Chain Monte Carlo
- Monte Carlo sampling methods using Markov chains and their applications
- Langevin diffusions and Metropolis-Hastings algorithms
- Optimal scaling for various Metropolis-Hastings algorithms
- Geometric ergodicity of Metropolis algorithms
- Minimising MCMC variance via diffusion limits, with an application to simulated tempering
- Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
- Equation of state calculations by fast computing machines
- A tail inequality for quadratic forms of subgaussian random vectors
- General state space Markov chains and MCMC algorithms
- Rates of convergence of the Hastings and Metropolis algorithms
- Expansion of the global error for numerical schemes solving stochastic differential equations
- Isoperimetric problems for convex bodies and a localization lemma
- Hit-and-run mixes fast
- Randomized interior point methods for sampling and optimization
- Random walks in a convex body and an improved volume algorithm
- A random polynomial-time algorithm for approximating the volume of convex bodies
- Title not available
- Hit-and-Run from a Corner
- Computable bounds for geometric convergence rates of Markov chains
- A cubic algorithm for computing Gaussian volume
- Hit-and-Run Algorithms for Generating Multivariate Distributions
- Convex optimization: algorithms and complexity
- Error bounds for Metropolis-Hastings algorithms applied to perturbations of Gaussian measures in high dimensions
- Nonasymptotic mixing of the MALA algorithm
- Title not available
- The Markov moment problem and de Finetti's theorem. II
- The geometry of logconcave functions and sampling algorithms
- Optimal scaling and diffusion limits for the Langevin algorithm in high dimensions
- High-dimensional Bayesian inference via the unadjusted Langevin algorithm
- Sampling from log-concave distributions
- A convex/log-concave correlation inequality for Gaussian measure and an application to abstract Wiener spaces
- Theoretical Guarantees for Approximate Sampling from Smooth and Log-Concave Densities
- Couplings and quantitative contraction rates for Langevin dynamics
- Sampling from a log-concave distribution with projected Langevin Monte Carlo
- Mixing of Hamiltonian Monte Carlo on strongly log-concave distributions: continuous dynamics
- Coupling and convergence for Hamiltonian Monte Carlo
- Proximal Markov chain Monte Carlo algorithms
- Fast MCMC sampling algorithms on polytopes
- Convergence of Langevin MCMC in KL-divergence
- Efficient Bayesian computation by proximal Markov chain Monte Carlo: when Langevin meets Moreau
- Fast mixing of Metropolized Hamiltonian Monte Carlo: benefits of multi-step gradients
Cited In (37)
- Title not available
- Finite-sample complexity of sequential Monte Carlo estimators
- Title not available
- Fast mixing of Metropolized Hamiltonian Monte Carlo: benefits of multi-step gradients
- Error bounds for Metropolis-Hastings algorithms applied to perturbations of Gaussian measures in high dimensions
- High-dimensional MCMC with a standard splitting scheme for the underdamped Langevin diffusion
- Normalizing constants of log-concave densities
- An entropic approach for Hamiltonian Monte Carlo: the idealized case
- Randomized Hamiltonian Monte Carlo as scaling limit of the bouncy particle sampler and dimension-free convergence rates
- Scalable Bayesian computation for crossed and nested hierarchical models
- Mixing of Metropolis-adjusted Markov chains via couplings: the high acceptance regime
- Improved bounds for discretization of Langevin diffusions: near-optimal rates without convexity
- Oracle lower bounds for stochastic gradient sampling algorithms
- Stochastic zeroth-order discretizations of Langevin diffusions for Bayesian inference
- Dimension-free mixing times of Gibbs samplers for Bayesian hierarchical models
- Constrained ensemble Langevin Monte Carlo
- Truncated log-concave sampling for convex bodies with reflective Hamiltonian Monte Carlo
- Learning variational autoencoders via MCMC speed measures
- Multielement polynomial chaos kriging-based metamodelling for Bayesian inference of non-smooth systems
- Data-free likelihood-informed dimension reduction of Bayesian inverse problems
- Title not available
- Ensemble Kalman sampler: mean-field limit and convergence analysis
- Explicit convergence bounds for Metropolis Markov chains: isoperimetry, spectral gaps and profiles
- Complexity of zigzag sampling algorithm for strongly log-concave distributions
- Is there an analog of Nesterov acceleration for gradient-based MCMC?
- Ergodicity of the infinite swapping algorithm at low temperature
- Nonasymptotic mixing of the MALA algorithm
- On Irreversible Metropolis Sampling Related to Langevin Dynamics
- Convergence rates of Metropolis-Hastings algorithms
- Convergence of Position-Dependent MALA with Application to Conditional Simulation in GLMMs
- Sampling from a log-concave distribution with projected Langevin Monte Carlo
- On the limitations of single-step drift and minorization in Markov chain convergence analysis
- Complexity results for MCMC derived from quantitative bounds
- On sampling from a log-concave density using kinetic Langevin diffusions
- Laplacian smoothing stochastic gradient Markov chain Monte Carlo
- Approximate spectral gaps for Markov chain mixing times in high dimensions
- Finite sample complexity of sequential Monte Carlo estimators on multimodal target distributions
This page was built for publication: Log-concave sampling: Metropolis-Hastings algorithms are fast