Stochastic Stein Discrepancies

From MaRDI portal
Publication: Q6344508

arXiv: 2007.02857
MaRDI QID: Q6344508


Authors: Jackson Gorham, Anant Raj, Lester Mackey


Publication date: 6 July 2020

Abstract: Stein discrepancies (SDs) monitor convergence and non-convergence in approximate inference when exact integration and sampling are intractable. However, the computation of a Stein discrepancy can be prohibitive if the Stein operator - often a sum over likelihood terms or potentials - is expensive to evaluate. To address this deficiency, we show that stochastic Stein discrepancies (SSDs) based on subsampled approximations of the Stein operator inherit the convergence control properties of standard SDs with probability 1. Along the way, we establish the convergence of Stein variational gradient descent (SVGD) on unbounded domains, resolving an open question of Liu (2017). In our experiments with biased Markov chain Monte Carlo (MCMC) hyperparameter tuning, approximate MCMC sampler selection, and stochastic SVGD, SSDs deliver comparable inferences to standard SDs with orders of magnitude fewer likelihood evaluations.
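The abstract describes replacing an expensive Stein operator, whose score function is a sum over likelihood terms, with a subsampled approximation. The toy sketch below (not the paper's implementation; model, variable names, and the single fixed minibatch per discrepancy evaluation are all illustrative assumptions) estimates a 1-D kernel Stein discrepancy with an IMQ kernel twice: once with the exact posterior score and once with a minibatch score estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: prior x ~ N(0, 1), likelihood y_i ~ N(x, 1).
# The posterior score decomposes as a sum over likelihood terms:
#   score(x) = -x + sum_i (y_i - x)
N = 1000
y = rng.normal(1.0, 1.0, size=N)

def full_score(x):
    # exact posterior score: prior term plus all N likelihood terms
    return -x + np.sum(y - x)

def subsampled_score(x, batch):
    # unbiased estimate: prior term plus a rescaled minibatch sum
    return -x + (N / len(batch)) * np.sum(y[batch] - x)

# IMQ kernel k(a, b) = (1 + (a - b)^2)^(-beta) and its derivatives
beta = 0.5

def stein_kernel(a, b, sa, sb):
    # Langevin Stein kernel u_p(a, b) built from scores sa, sb
    d = a - b
    q = 1.0 + d * d
    k = q ** (-beta)
    dka = -2 * beta * d * q ** (-beta - 1)   # dk/da
    dkb = 2 * beta * d * q ** (-beta - 1)    # dk/db
    dkab = (2 * beta * q ** (-beta - 1)
            - 4 * beta * (beta + 1) * d * d * q ** (-beta - 2))
    return sa * sb * k + sa * dkb + sb * dka + dkab

def ksd2(xs, score_fn):
    # V-statistic estimate of the squared kernel Stein discrepancy;
    # nonnegative for any score function (it is a squared RKHS norm)
    s = np.array([score_fn(x) for x in xs])
    total = 0.0
    for i, xi in enumerate(xs):
        for j, xj in enumerate(xs):
            total += stein_kernel(xi, xj, s[i], s[j])
    return total / len(xs) ** 2

# candidate sample: overdispersed around the posterior mean
post_mean = np.sum(y) / (N + 1)
xs = post_mean + rng.normal(0.0, 0.1, size=50)

ksd_full = ksd2(xs, full_score)                      # all 1000 likelihood terms
batch = rng.choice(N, size=100, replace=False)       # 10x fewer evaluations
ksd_sub = ksd2(xs, lambda x: subsampled_score(x, batch))
print(ksd_full, ksd_sub)
```

The paper's SSDs draw fresh subsamples inside the discrepancy and prove the resulting estimates retain convergence control with probability 1; this sketch only shows the mechanical substitution of a cheap unbiased score estimate into the Stein kernel.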

Has companion code repository: https://github.com/jgorham/stochastic_stein_discrepancy

This page was built for publication: Stochastic Stein Discrepancies
