Variational Bayes with synthetic likelihood

From MaRDI portal
Publication:1704030

DOI: 10.1007/s11222-017-9773-3
zbMATH Open: 1384.65015
arXiv: 1608.03069
OpenAlex: W2512401777
Wikidata: Q62899349
Scholia: Q62899349
MaRDI QID: Q1704030
FDO: Q1704030


Authors: Victor M. H. Ong, David J. Nott, Minh-Ngoc Tran, S. A. Sisson, C. C. Drovandi


Publication date: 8 March 2018

Published in: Statistics and Computing

Abstract: Synthetic likelihood is an attractive approach to likelihood-free inference when an approximately Gaussian summary statistic for the data, informative for inference about the parameters, is available. The synthetic likelihood method derives an approximate likelihood function from a plug-in normal density estimate for the summary statistic, with the plug-in mean and covariance matrix obtained by Monte Carlo simulation from the model. In this article, we develop alternatives to Markov chain Monte Carlo implementations of Bayesian synthetic likelihood with reduced computational overheads. Our approach uses stochastic gradient variational inference methods for posterior approximation in the synthetic likelihood context, employing unbiased estimates of the log likelihood. We compare the new method with a related likelihood-free variational inference technique in the literature, while at the same time improving the implementation of that approach in a number of ways. These new algorithms are feasible to implement in situations which are challenging for conventional approximate Bayesian computation (ABC) methods, in terms of the dimensionality of the parameter and summary statistic.
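The abstract describes the plug-in Gaussian estimate at the core of synthetic likelihood: simulate summary statistics from the model at a given parameter value, fit a normal density by the sample mean and covariance, and evaluate the observed summary under it. The following is a minimal sketch of that estimate, not the paper's full variational algorithm; the toy model and function names are illustrative assumptions.

```python
import numpy as np

def log_synthetic_likelihood(theta, s_obs, simulate, n_sims=200, rng=None):
    """Plug-in Gaussian synthetic log-likelihood estimate.

    theta    : parameter value at which to evaluate
    s_obs    : observed summary statistic vector
    simulate : function (theta, rng) -> summary statistic vector
    n_sims   : number of model simulations for the plug-in estimate
    """
    rng = np.random.default_rng() if rng is None else rng
    # Monte Carlo simulations of the summary statistic at theta
    S = np.array([simulate(theta, rng) for _ in range(n_sims)])
    mu = S.mean(axis=0)                        # plug-in mean
    Sigma = np.atleast_2d(np.cov(S, rowvar=False))  # plug-in covariance
    diff = np.atleast_1d(s_obs) - mu
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    d = diff.size
    # Gaussian log-density of the observed summary under the estimate
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

# Hypothetical toy model: data ~ N(theta, 1); summaries = (mean, variance)
def simulate(theta, rng):
    y = rng.normal(theta, 1.0, size=50)
    return np.array([y.mean(), y.var()])

rng = np.random.default_rng(0)
s_obs = simulate(1.0, rng)
ll = log_synthetic_likelihood(1.0, s_obs, simulate, rng=rng)
```

Because each call resimulates from the model, this estimate is stochastic; the paper's contribution is to use such noisy (and, after suitable corrections, unbiased) log-likelihood estimates inside stochastic gradient variational inference rather than inside an MCMC sampler.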


Full work available at URL: https://arxiv.org/abs/1608.03069




Cited In (18)


This page was built for publication: Variational Bayes with synthetic likelihood
