Convergence rates of variational posterior distributions (Q2215731)

From MaRDI portal
scientific article

    Statements

    Convergence rates of variational posterior distributions (English)
    Publication date: 14 December 2020
    Variational Bayes inference is a popular technique for approximating posterior distributions that are difficult to compute. The goal of the paper under review is to study the variational posterior distribution from a theoretical perspective. The authors propose general conditions on the prior, the likelihood, and the variational class that characterize the convergence rate of the variational posterior to the true data-generating process. In addition, several aspects of variational Bayes inference are discussed. It is shown that for a general likelihood with a sieve prior, the mean-field variational approximation of the posterior distribution has an interesting relation to an empirical Bayes procedure; in fact, the empirical Bayes procedure is exactly a variational Bayes procedure using a specially designed variational class. Finally, the authors remark that the general rate for variational posteriors is only an upper bound. Sometimes the variational posterior may not be a good approximation to the true posterior, yet it can still contract faster to the true parameter if additional regularity is imposed by the variational class. Constructed examples illustrate this point.
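    For context, the following is a minimal sketch of the standard variational Bayes formulation that the review refers to; the notation (data \(X^{(n)}\), prior \(\Pi\), variational class \(\mathcal{Q}\), loss \(d\), true parameter \(\theta^{*}\), constant \(M\), rate \(\varepsilon_n\)) is illustrative and not taken from the entry above.
    % Illustrative notation only; these symbols are not taken from the paper itself.
    % The variational posterior is the KL-projection of the posterior onto a chosen
    % variational class, e.g. a mean-field (product) class:
    \[
      \widehat{Q} \;=\; \operatorname*{arg\,min}_{Q \in \mathcal{Q}}
        D_{\mathrm{KL}}\!\bigl(Q \,\big\|\, \Pi(\cdot \mid X^{(n)})\bigr),
      \qquad
      \mathcal{Q}_{\mathrm{MF}} \;=\; \Bigl\{\, Q : Q = \prod_{j} Q_j \,\Bigr\}.
    \]
    % A contraction-rate statement for the variational posterior then takes the form
    \[
      \widehat{Q}\bigl(\theta : d(\theta, \theta^{*}) \ge M \varepsilon_n\bigr)
        \;\longrightarrow\; 0 \quad \text{in } P_{\theta^{*}}\text{-probability},
    \]
    % for a loss d, a sufficiently large constant M, and a rate eps_n -> 0.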
    posterior contraction
    mean-field variational inference
    density estimation
    Gaussian sequence model
    piecewise constant model
    empirical Bayes