Metropolized randomized maximum likelihood for improved sampling from multimodal distributions
From MaRDI portal
Publication:5269863
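The randomized maximum likelihood (RML) idea named in the title can be sketched in a few lines: perturb the observed data and the prior mean, then minimize the resulting least-squares objective to obtain one posterior proposal. The sketch below is a minimal illustration only; the 1D forward model, noise levels, and starting bracket are assumptions, and the exact Metropolis correction developed in the paper is omitted.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical 1D setup (not from the paper): G(m) = m**2 yields a
# bimodal posterior with modes near m = +/- sqrt(d_obs).
rng = np.random.default_rng(0)
G = lambda m: m**2
d_obs = 4.0          # observed datum (assumed)
sd, sm = 0.5, 2.0    # data / prior standard deviations (assumed)
m_pr = 0.0           # prior mean (assumed)

def rml_proposal():
    # RML proposal step: perturb data and prior mean, then minimize
    # the perturbed least-squares objective.
    d_pert = d_obs + sd * rng.standard_normal()
    m_pert = m_pr + sm * rng.standard_normal()
    J = lambda m: (G(m) - d_pert)**2 / (2 * sd**2) \
        + (m - m_pert)**2 / (2 * sm**2)
    # Start the line search near the perturbed prior draw, so either
    # mode can be reached depending on the sign of m_pert.
    res = minimize_scalar(J, bracket=(m_pert - 1.0, m_pert + 1.0))
    return res.x

samples = np.array([rml_proposal() for _ in range(200)])
# Proposals cluster around both posterior modes near +/- 2.
```

A full Metropolized RML sampler would accept or reject each such proposal using the proposal density induced by the perturbed optimization, which is the contribution of the cited publication.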
Cites work
- Title not available
- Scientific article (title not available); zbMATH DE number 46855
- Scientific article (title not available); zbMATH DE number 472922
- A Randomized Maximum A Posteriori Method for Posterior Sampling of High Dimensional Nonlinear Bayesian Inverse Problems
- A general framework for the parametrization of hierarchical models
- A stochastic Newton MCMC method for large-scale statistical inverse problems with application to seismic inversion
- An ensemble Kalman filter using the conjugate gradient sampler
- Bayesian Inference in Econometric Models Using Monte Carlo Integration
- Coarse-gradient Langevin algorithms for dynamic data integration and uncertainty quantification
- Distributed parameter and state estimation in petroleum reservoirs
- Efficiency of the Wang-Landau algorithm: a simple test case
- Investigation of the sampling performance of ensemble-based methods with a simple reservoir model
- MCMC methods for functions: modifying old algorithms to make them faster
- Markov chains for exploring posterior distributions (with discussion)
- Minimization for conditional simulation: relationship to optimal transport
- Mode jumping proposals in MCMC
- On the Use of Local Optimizations within Metropolis–Hastings Updates
- On the efficiency of pseudo-marginal random walk Metropolis algorithms
- On the flexibility of Metropolis-Hastings acceptance probabilities in auxiliary variable proposal generation
- Optimal scaling for various Metropolis-Hastings algorithms
- Probabilistic Forecasting and Bayesian Data Assimilation
- The Multiple-Try Method and Local Optimization in Metropolis Sampling
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- The pseudo-marginal approach for efficient Monte Carlo computations
- Uncertainty quantification for porous media flows
- Weak convergence and optimal scaling of random walk Metropolis algorithms
Cited in (16)
- Comparison of regularized ensemble Kalman filter and tempered ensemble transform particle filter for an elliptic inverse problem with uncertain boundary conditions
- Optimization-Based Markov Chain Monte Carlo Methods for Nonlinear Hierarchical Statistical Inverse Problems
- Inference via low-dimensional couplings
- Scalable Optimization-Based Sampling on Function Space
- Learning physics-based models from data: perspectives from inverse problems and model reduction
- A data-space inversion procedure for well control optimization and closed-loop reservoir management
- Calibration of imperfect models to biased observations
- Data assimilation in truncated plurigaussian models: impact of the truncation map
- On the Use of Local Optimizations within Metropolis–Hastings Updates
- Variational inference for nonlinear inverse problems via neural net kernels: comparison to Bayesian neural networks, application to topology optimization
- Gaussian mixture model fitting method for uncertainty quantification by conditioning to production data
- Metropolized Knockoff Sampling
- A modified randomized maximum likelihood for improved Bayesian history matching
- A new technique for sampling multi-modal distributions
- Distributed Gauss-Newton optimization method for history matching problems with multiple best matches
- Randomized maximum likelihood based posterior sampling