Riemann manifold Langevin and Hamiltonian Monte Carlo methods. With discussion and authors' reply
DOI: 10.1111/J.1467-9868.2010.00765.X · zbMATH Open: 1411.62071 · OpenAlex: W1545319692 · Wikidata: Q56700917 · Scholia: Q56700917 · MaRDI QID: Q4631607 · FDO: Q4631607
Authors: Mark Girolami, Ben Calderhead
Publication date: 12 April 2019
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://doi.org/10.1111/j.1467-9868.2010.00765.x
Recommendations
- A general metric for Riemannian manifold Hamiltonian Monte Carlo
- Modified Cholesky Riemann manifold Hamiltonian Monte Carlo: exploiting sparsity for fast sampling of high-dimensional targets
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Proximal Markov chain Monte Carlo algorithms
- MCMC using Hamiltonian dynamics
Keywords: Bayesian inference; Markov chain Monte Carlo methods; Langevin diffusion; Riemann manifolds; geometry in statistics; Hamiltonian Monte Carlo methods
MSC classification:
- 62F15 Bayesian inference
- 62H11 Directional data; spatial statistics
- 65C05 Monte Carlo methods
- 62-02 Research exposition (monographs, survey articles) pertaining to statistics
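The methods indexed above (Langevin diffusion, Riemannian metrics, Hamiltonian Monte Carlo) can be illustrated with a minimal sketch of a Metropolis-adjusted Langevin algorithm (MALA) using a fixed preconditioning metric. This is a simplified, constant-metric special case of the position-dependent Riemann manifold MALA studied in the paper; all names here (`log_target`, `grad_log_target`, the choice `G = Sigma_inv`) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: zero-mean 2-D Gaussian with covariance Sigma.
Sigma = np.array([[2.0, 0.8], [0.8, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def log_target(x):
    return -0.5 * x @ Sigma_inv @ x

def grad_log_target(x):
    return -Sigma_inv @ x

# Constant metric G: for a Gaussian target the Fisher information is
# Sigma_inv, so G = Sigma_inv and G^{-1} = Sigma act as a natural
# preconditioner (the manifold methods in the paper let G vary with position).
G = Sigma_inv
G_inv = Sigma
L = np.linalg.cholesky(G_inv)

eps = 0.5  # step size

def log_q(to, mean):
    # Log-density (up to a constant) of the Gaussian proposal
    # N(mean, eps^2 * G_inv) evaluated at `to`.
    d = to - mean
    return -0.5 / eps**2 * d @ G @ d

x = np.zeros(2)
samples = []
for _ in range(5000):
    # Langevin proposal: drift along the preconditioned gradient plus noise.
    mean_x = x + 0.5 * eps**2 * G_inv @ grad_log_target(x)
    y = mean_x + eps * L @ rng.standard_normal(2)
    mean_y = y + 0.5 * eps**2 * G_inv @ grad_log_target(y)
    # Metropolis-Hastings correction with the asymmetric proposal ratio.
    log_alpha = (log_target(y) + log_q(x, mean_y)
                 - log_target(x) - log_q(y, mean_x))
    if np.log(rng.uniform()) < log_alpha:
        x = y
    samples.append(x)

samples = np.array(samples)
```

With the metric matched to the target's Fisher information, the proposal automatically adapts to the scale and correlation of the target, which is the intuition the paper develops for general (non-Gaussian) posteriors.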
Cited In (first 100 items)
- Zero variance differential geometric Markov chain Monte Carlo algorithms
- Directional distance functions: optimal endogenous directions
- Bayesian inference for brain activity from functional magnetic resonance imaging collected at two spatial resolutions
- A comparative evaluation of stochastic-based inference methods for Gaussian process models
- Gibbs samplers for logistic item response models via the Pólya-gamma distribution: a computationally efficient data-augmentation strategy
- Bayesian inverse problems and Kalman filters
- Parameter uncertainty in biochemical models described by ordinary differential equations
- Particle Metropolis-Hastings using gradient and Hessian information
- Stochastic approximation Hamiltonian Monte Carlo
- Bayesian inference with optimal maps
- Noisy Monte Carlo: convergence of Markov chains with approximate transition kernels
- Efficient Bayesian inference of general Gaussian models on large phylogenetic trees
- De Finetti priors using Markov chain Monte Carlo computations
- Striated Metropolis-Hastings sampler for high-dimensional models
- On a generalization of the preconditioned Crank-Nicolson metropolis algorithm
- Algorithms for Kullback-Leibler approximation of probability measures in infinite dimensions
- Certified dimension reduction in nonlinear Bayesian inverse problems
- Measuring sample quality with diffusions
- Random points on an algebraic manifold
- Iterative importance sampling algorithms for parameter estimation
- Efficient real-time monitoring of an emerging influenza pandemic: how feasible?
- The good, the bad and the technology: endogeneity in environmental production models
- Geodesic Monte Carlo on embedded manifolds
- Accelerating Monte Carlo estimation with derivatives of high-level finite element models
- A Bayesian approach to estimate parameters of ordinary differential equation
- Sequentially constrained Monte Carlo
- Characterization of sporadic perfect polynomials over \(\mathbb{F}_2\)
- Coordinate sampler: a non-reversible Gibbs-like MCMC sampler
- Information-geometric Markov chain Monte Carlo methods using diffusions
- Variance reduction using nonreversible Langevin samplers
- Controlled sequential Monte Carlo
- A uniformly accurate scheme for the numerical integration of penalized Langevin dynamics
- Metropolis integration schemes for self-adjoint diffusions
- Hessian-based adaptive sparse quadrature for infinite-dimensional Bayesian inverse problems
- Dimension-independent likelihood-informed MCMC
- Data-driven probability concentration and sampling on manifold
- Extra chance generalized hybrid Monte Carlo
- Scalable estimation strategies based on stochastic approximations: classical results and new insights
- Using perturbed underdamped Langevin dynamics to efficiently sample from probability distributions
- A geometric variational approach to Bayesian inference
- Accelerated dimension-independent adaptive metropolis
- Generally covariant state-dependent diffusion
- Posterior consistency for Gaussian process approximations of Bayesian posterior distributions
- Bayesian adaptation of chaos representations using variational inference and sampling on geodesics
- Efficient Adaptive MCMC Through Precision Estimation
- Improving the convergence of reversible samplers
- FEM-based discretization-invariant MCMC methods for PDE-constrained Bayesian inverse problems
- On Irreversible Metropolis Sampling Related to Langevin Dynamics
- A Bayesian Approach to Estimating Background Flows from a Passive Scalar
- Adaptive schemes for piecewise deterministic Monte Carlo algorithms
- Bayesian Probabilistic Numerical Methods
- Hybrid Monte Carlo on Hilbert spaces
- Emulation of higher-order tensors in manifold Monte Carlo methods for Bayesian inverse problems
- Advanced MCMC methods for sampling on diffusion pathspace
- Hybrid Monte Carlo methods for sampling probability measures on submanifolds
- Stochastic dominance tests
- A neural network assisted Metropolis adjusted Langevin algorithm
- Wang-Landau algorithm: an adapted random walk to boost convergence
- Expectation propagation for nonlinear inverse problems -- with an application to electrical impedance tomography
- \(\Pi\)4U: a high performance computing framework for Bayesian uncertainty quantification of complex models
- Sampling from manifold-restricted distributions using tangent bundle projections
- Intrinsic modeling of stochastic dynamical systems using empirical geometry
- Bayesian prediction for physical models with application to the optimization of the synthesis of pharmaceutical products using chemical kinetics
- Proposals which speed up function-space MCMC
- The geometric foundations of Hamiltonian Monte Carlo
- Hamiltonian-Assisted Metropolis Sampling
- High-dimensional Bayesian parameter estimation: case study for a model of JAK2/STAT5 signaling
- An Acceleration Strategy for Randomize-Then-Optimize Sampling Via Deep Neural Networks
- Langevin diffusions and the Metropolis-adjusted Langevin algorithm
- Gaussian processes with built-in dimensionality reduction: applications to high-dimensional uncertainty propagation
- Ensemble samplers with affine invariance
- The Bouncy Particle Sampler: A Non-Reversible Rejection-Free Markov Chain Monte Carlo Method
- On a high-dimensional model representation method based on copulas
- MCMC using Hamiltonian dynamics
- Solving inverse problems using data-driven models
- Convergent stochastic expectation maximization algorithm with efficient sampling in high dimension. Application to deformable template model estimation
- Ensemble preconditioning for Markov chain Monte Carlo simulation
- Langevin incremental mixture importance sampling
- Survey of multifidelity methods in uncertainty propagation, inference, and optimization
- Markov-chain Monte-Carlo methods and non-identifiabilities
- Control Theory and Experimental Design in Diffusion Processes
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Support points
- Bayesian inference of the fractional Ornstein-Uhlenbeck process under a flow sampling scheme
- Adaptive step size selection for Hessian-based manifold Langevin samplers
- Importance sampling from posterior distributions using copula-like approximations
- SIMD parallel MCMC sampling with applications for big-data Bayesian analytics
- On the geometric ergodicity of Hamiltonian Monte Carlo
- Irreducibility and geometric ergodicity of Hamiltonian Monte Carlo
- Markov chain Monte Carlo sampling using a reservoir method
- Asymptotic analysis of the random walk metropolis algorithm on ridged densities
- Importance sampling for Kolmogorov backward equations
- Marginal reversible jump Markov chain Monte Carlo with application to motor unit number estimation
- On the estimation of total factor productivity: a novel Bayesian non-parametric approach
- Statistical Inference, Learning and Models in Big Data
- ATLAS: a geometric approach to learning high-dimensional stochastic systems near manifolds
- Incompressible Euler equations with stochastic forcing: a geometric approach