Efficient importance sampling for binary contingency tables

From MaRDI portal

Abstract: Importance sampling has been reported to produce algorithms with excellent empirical performance in counting problems. However, the theoretical support for its efficiency in these applications has been very limited. In this paper, we propose a methodology that can be used to design efficient importance sampling algorithms for counting and to test their efficiency rigorously. We apply our techniques after transforming the problem into a rare-event simulation problem, thereby connecting the complexity analysis of counting problems with efficiency in the context of rare-event simulation. As an illustration of our approach, we consider the problem of counting the number of binary tables with fixed column and row sums, $c_j$'s and $r_i$'s, respectively, and total marginal sum $d = \sum_j c_j$. Assuming that $\max_j c_j = o(d^{1/2})$, $\sum_j c_j^2 = O(d)$ and the $r_i$'s are bounded, we show that a suitable importance sampling algorithm, proposed by Chen et al. [J. Amer. Statist. Assoc. 100 (2005) 109--120], requires $O(d^3 \varepsilon^{-2} \delta^{-1})$ operations to produce an estimate that has $\varepsilon$-relative error with probability $1 - \delta$. In addition, if $\max_j c_j = o(d^{1/4 - \delta_0})$ for some $\delta_0 > 0$, the same coverage can be guaranteed with $O(d^3 \varepsilon^{-2} \log(\delta^{-1}))$ operations.
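The sequential importance sampling idea referenced above builds the table column by column, drawing each column conditionally on the row sums still outstanding and correcting with an importance weight. The following is a minimal Python sketch of that scheme, assuming a deliberately naive uniform column proposal; it is not the calibrated conditional proposal of Chen et al. that the abstract analyzes, and the function name is an illustrative choice.

```python
import math
import random

def estimate_num_tables(row_sums, col_sums, n_samples):
    """Sequential importance sampling estimate of the number of 0-1 tables
    with the given row and column sums (illustrative sketch; the proposal
    below is a naive uniform one, not the calibrated proposal analyzed in
    the paper)."""
    if sum(row_sums) != sum(col_sums):
        return 0.0
    rng = random.Random(0)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(n_samples):
        remaining = list(row_sums)  # row sums still to be placed
        weight = 1.0
        for c in col_sums:
            # rows that can still accept a 1 in this column
            avail = [i for i, r in enumerate(remaining) if r > 0]
            if len(avail) < c:
                weight = 0.0  # proposal got stuck: this sample contributes 0
                break
            # uniform proposal over size-c subsets of the available rows has
            # probability 1 / C(len(avail), c), so the importance weight
            # picks up the reciprocal
            weight *= math.comb(len(avail), c)
            for i in rng.sample(avail, c):
                remaining[i] -= 1
        if weight > 0.0 and any(remaining):
            weight = 0.0  # row sums not met exactly: invalid table
        total += weight
    return total / n_samples
```

For example, with row sums (1, 1) and column sums (1, 1) there are exactly two such tables, and every sample carries weight 2, so the estimate is exact; for larger margins the estimator is unbiased but its variance depends heavily on the proposal, which is the efficiency question the paper addresses.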
This page was built for publication: Efficient importance sampling for binary contingency tables

MaRDI item: Q2389598