Optimal tuning of the hybrid Monte Carlo algorithm
Publication:2435211
DOI: 10.3150/12-BEJ414
zbMath: 1287.60090
arXiv: 1001.4460
OpenAlex: W1981514681
MaRDI QID: Q2435211
Alexandros Beskos, Andrew M. Stuart, Natesh S. Pillai, Gareth O. Roberts, Jesús María Sanz-Serna
Publication date: 4 February 2014
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1001.4460
Keywords: leapfrog scheme, Hamiltonian dynamics, high dimensions, optimal acceptance probability, squared jumping distance
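For orientation, the keywords refer to the hybrid (Hamiltonian) Monte Carlo algorithm, whose proposals integrate Hamiltonian dynamics with the leapfrog scheme and are corrected by a Metropolis accept/reject step; the paper's headline result is that, as the dimension d grows, the leapfrog step size should be scaled as d^{-1/4}, which leads to an optimal acceptance probability of about 0.651. The following is a minimal illustrative sketch of one HMC step on a standard Gaussian target, not code from the paper; the function names and the target are assumptions for the demonstration.

```python
import numpy as np

def U(q):
    """Potential energy of a standard Gaussian target: U(q) = |q|^2 / 2 (illustrative choice)."""
    return 0.5 * np.dot(q, q)

def grad_U(q):
    """Gradient of U for the Gaussian target."""
    return q

def leapfrog(q, p, step, n_steps):
    """Volume-preserving, time-reversible leapfrog integration of Hamiltonian dynamics."""
    p = p - 0.5 * step * grad_U(q)        # initial half step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p                  # full step in position
        p = p - step * grad_U(q)          # full step in momentum
    q = q + step * p                      # final full step in position
    p = p - 0.5 * step * grad_U(q)        # final half step in momentum
    return q, p

def hmc_step(q, step, n_steps, rng):
    """One HMC transition: fresh Gaussian momentum, leapfrog proposal,
    Metropolis accept/reject on the energy error H(new) - H(old)."""
    p = rng.standard_normal(q.shape)
    q_new, p_new = leapfrog(q, p, step, n_steps)
    h_old = U(q) + 0.5 * np.dot(p, p)
    h_new = U(q_new) + 0.5 * np.dot(p_new, p_new)
    if rng.random() < np.exp(min(0.0, h_old - h_new)):
        return q_new, True
    return q, False
```

In this sketch, tuning amounts to choosing the step size so that the empirical acceptance rate sits near the paper's asymptotically optimal value of roughly 0.651; a small step size drives the energy error, and hence the rejection rate, toward zero at the cost of shorter moves per gradient evaluation.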
Related Items
- HMC: reducing the number of rejections by not using leapfrog and some results on the acceptance rate
- Optimal scaling of random walk Metropolis algorithms using Bayesian large-sample asymptotics
- Merging MCMC subposteriors through Gaussian-process approximations
- Designing simple and efficient Markov chain Monte Carlo proposal kernels
- An adaptive multiple-try Metropolis algorithm
- Forward Event-Chain Monte Carlo: Fast Sampling by Randomness Control in Irreversible Markov Chains
- FEM-based discretization-invariant MCMC methods for PDE-constrained Bayesian inverse problems
- Multiple-time-stepping generalized hybrid Monte Carlo methods
- Hierarchical models: local proposal variances for RWM-within-Gibbs and MALA-within-Gibbs
- On the application of improved symplectic integrators in Hamiltonian Monte Carlo
- Adaptive multi-stage integrators for optimal energy conservation in molecular simulations
- Copula multivariate GARCH model with constrained Hamiltonian Monte Carlo
- Rejoinder on: "Some recent work on multivariate Gaussian Markov random fields"
- Thread-Parallel Anisotropic Mesh Adaptation
- Optimal scaling and diffusion limits for the Langevin algorithm in high dimensions
- Finite element model updating using Hamiltonian Monte Carlo techniques
- Bayesian elastic net based on empirical likelihood
- Adaptive multiple importance sampling for Gaussian processes
- Bayesian computation for Log-Gaussian Cox processes: a comparative analysis of methods
- Palindromic 3-stage splitting integrators, a roadmap
- Non-reversible guided Metropolis kernel
- System identification using autoregressive Bayesian neural networks with nonparametric noise models
- Constrained Hamiltonian Monte Carlo in BEKK GARCH with targeting
- Optimal scaling of random-walk Metropolis algorithms on general target distributions
- Convergence of unadjusted Hamiltonian Monte Carlo for mean-field models
- Concave-Convex PDMP-based Sampling
- The Apogee to Apogee Path Sampler
- Adaptive tuning of Hamiltonian Monte Carlo within sequential Monte Carlo
- Modified Cholesky Riemann manifold Hamiltonian Monte Carlo: exploiting sparsity for fast sampling of high-dimensional targets
- Markov chain Monte Carlo algorithms with sequential proposals
- Localization for MCMC: sampling high-dimensional posterior distributions with local structure
- Optimal scaling for the transient phase of Metropolis-Hastings algorithms: the longtime behavior
- Leveraging Bayesian analysis to improve accuracy of approximate models
- Simulating Coulomb and log-gases with hybrid Monte Carlo algorithms
- MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
- Mixing of Hamiltonian Monte Carlo on strongly log-concave distributions: continuous dynamics
- Symmetrically Processed Splitting Integrators for Enhanced Hamiltonian Monte Carlo Sampling
- Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler
- Rejoinder: Geodesic Monte Carlo on Embedded Manifolds
- Weak convergence and optimal tuning of the reversible jump algorithm
- Parameter estimation in stochastic differential equations with Markov chain Monte Carlo and non-linear Kalman filtering
- Hierarchical models and tuning of random walk Metropolis algorithms
- Adaptive Thermostats for Noisy Gradient Systems
- Leave Pima Indians alone: binary regression as a benchmark for Bayesian computation
- Extra chance generalized hybrid Monte Carlo
- On the stability of sequential Monte Carlo methods in high dimensions
- Asymptotic variance for random walk Metropolis chains in high dimensions: logarithmic growth via the Poisson equation
- Bayesian computation: a summary of the current state, and samples backwards and forwards
- MCMC methods for functions: modifying old algorithms to make them faster
- Random walk Metropolis algorithm in high dimension with non-Gaussian target distributions
- Two-scale coupling for preconditioned Hamiltonian Monte Carlo in infinite dimensions
- Modified Hamiltonian Monte Carlo for Bayesian inference
- Recycling intermediate steps to improve Hamiltonian Monte Carlo
- Adaptive Step Size Selection for Hessian-Based Manifold Langevin Samplers
- Noisy Hamiltonian Monte Carlo for Doubly Intractable Distributions
- Geometric integrators and the Hamiltonian Monte Carlo method
- Accelerated Dimension-Independent Adaptive Metropolis
- Randomized Hamiltonian Monte Carlo as scaling limit of the bouncy particle sampler and dimension-free convergence rates
- Spatial voting models in circular spaces: a case study of the U.S. House of Representatives
- On the geometric ergodicity of Hamiltonian Monte Carlo
- Adaptive random neighbourhood informed Markov chain Monte Carlo for high-dimensional Bayesian variable selection
- Semi-supervised nonparametric Bayesian modelling of spatial proteomics
- Split Hamiltonian Monte Carlo revisited
- Mixing rates for Hamiltonian Monte Carlo algorithms in finite and infinite dimensions
- Skew Brownian motion and complexity of the ALPS algorithm
- Optimal scaling for the transient phase of the random walk Metropolis algorithm: the mean-field limit
Cites Work
- Diffusion limits of the random walk Metropolis algorithm in high dimensions
- Hybrid Monte Carlo on Hilbert spaces
- A comparison of generalized hybrid Monte Carlo methods with and without momentum flip
- Exponential convergence of Langevin distributions and their discrete approximations
- Weak convergence and optimal scaling of random walk Metropolis algorithms
- Optimal scaling for various Metropolis-Hastings algorithms
- Analysis of a nonreversible Markov chain sampler
- Shadow hybrid Monte Carlo: an efficient propagator in phase space of macromolecules
- Bayesian learning for neural networks
- Optimal scaling and diffusion limits for the Langevin algorithm in high dimensions
- Optimal scalings for local Metropolis-Hastings chains on nonproduct targets in high dimensions
- Weak convergence of Metropolis algorithms for non-I.I.D. target distributions
- Monte Carlo strategies in scientific computing
- Optimal acceptance rates for Metropolis algorithms: Moving beyond 0.234
- Accelerated Monte Carlo for optimal estimation of time series
- Manifold Stochastic Dynamics for Bayesian Learning
- Simulating Hamiltonian Dynamics
- MCMC METHODS FOR DIFFUSION BRIDGES
- Optimal Scaling of Discrete Approximations to Langevin Diffusions
- An Introduction to Stein's Method
- Scaling Limits for the Transient Phase of Local Metropolis–Hastings Algorithms
- Theoretical and numerical comparison of some sampling methods for molecular dynamics
- Geometric Numerical Integration
- Speeding up the hybrid Monte Carlo algorithm for dynamical fermions