Iterative importance sampling algorithms for parameter estimation


DOI: 10.1137/16M1088417 · zbMATH Open: 1385.65007 · arXiv: 1608.01958 · OpenAlex: W2769566789 · Wikidata: Q130175704 · Scholia: Q130175704 · MaRDI QID: Q4607633 · FDO: Q4607633


Authors: Matthias Morzfeld, M. S. Day, R. W. Grout, George Shu Heng Pau, Stefan Finsterle, J. B. Bell


Publication date: 14 March 2018

Published in: SIAM Journal on Scientific Computing

Abstract: In parameter estimation problems one computes a posterior distribution over uncertain parameters defined jointly by a prior distribution, a model, and noisy data. Markov chain Monte Carlo (MCMC) is often used for the numerical solution of such problems. An alternative to MCMC is importance sampling, which can exhibit near-perfect scaling with the number of cores on high-performance computing systems because samples are drawn independently. However, finding a suitable proposal distribution is a challenging task. Several sampling algorithms have been proposed in recent years that take an iterative approach to constructing a proposal distribution. We investigate the applicability of such algorithms by applying them to two realistic and challenging test problems, one in subsurface flow and one in combustion modeling. More specifically, we implement importance sampling algorithms that iterate over the mean and covariance matrix of Gaussian or multivariate t proposal distributions. Our implementation leverages massively parallel computers, and we present strategies to initialize the iterations using "coarse" MCMC runs or Gaussian mixture models.
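
The abstract describes schemes that repeatedly update the mean and covariance of a Gaussian (or multivariate t) proposal for importance sampling. As a rough illustration only, and not the authors' implementation, the following Python sketch shows a self-normalized importance sampler whose Gaussian proposal is adapted by weighted moment matching; the toy log_posterior, the initialization, and all numerical settings are placeholder assumptions.

    import numpy as np
    from scipy.stats import multivariate_normal

    def log_posterior(theta):
        # Placeholder target: an unnormalized log-posterior for illustration,
        # NOT one of the paper's subsurface-flow or combustion test problems.
        return -0.5 * np.sum((theta - 1.0) ** 2)

    def iterative_importance_sampling(log_post, dim, n_samples=5000, n_iter=10, seed=0):
        # Self-normalized importance sampling with a Gaussian proposal whose
        # mean and covariance are updated by weighted moment matching.
        rng = np.random.default_rng(seed)
        mean = np.zeros(dim)  # in practice: initialize from a coarse MCMC run or a Gaussian mixture fit
        cov = np.eye(dim)
        for _ in range(n_iter):
            # Independent draws from the current proposal (embarrassingly parallel in principle).
            samples = rng.multivariate_normal(mean, cov, size=n_samples)
            log_q = multivariate_normal.logpdf(samples, mean=mean, cov=cov)
            log_p = np.array([log_post(s) for s in samples])
            log_w = log_p - log_q                 # log of unnormalized weights p/q
            w = np.exp(log_w - log_w.max())
            w /= w.sum()                          # self-normalized importance weights
            ess = 1.0 / np.sum(w ** 2)            # effective sample size (proposal quality)
            mean = w @ samples                    # weighted mean
            centered = samples - mean
            cov = (centered * w[:, None]).T @ centered + 1e-8 * np.eye(dim)  # weighted covariance + jitter
        return samples, w, ess

    samples, w, ess = iterative_importance_sampling(log_posterior, dim=2)
    print(f"effective sample size after adaptation: {ess:.1f}")

An effective sample size close to n_samples suggests the adapted proposal matches the posterior well; posterior expectations are then estimated as weighted averages over the final sample set. This moment-matching update is one common variant of iterative proposal construction and is not necessarily the specific iteration studied in the paper.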


Full work available at URL: https://arxiv.org/abs/1608.01958










Cited In (13)





