Simple Monte Carlo and the Metropolis algorithm (Q2465298)

From MaRDI portal
scientific article

    Statements

    Simple Monte Carlo and the Metropolis algorithm (English)
    9 January 2008
    Integrals of the form \( \int_{\Omega} f(x)\, c\rho(x)\, \mu(dx) \) with respect to an unknown density \(c\rho(x)\), where \(c > 0\), \(x \in \Omega\), \(\mu\) is a probability measure and \((f, \rho)\) belongs to a class \({\mathcal F}(\Omega)\), arise in many applications. The problem is rewritten in terms of the solution operator \[ S(f, \rho) = \frac{\int_{\Omega} f(x)\rho(x)\, \mu(dx)}{\int_{\Omega} \rho(x)\, \mu(dx)}. \] Randomized methods \(S_{n}\) that use \(n\) function evaluations of \(f\) and \(\rho\) are considered, namely the non-adaptive simple Monte Carlo method and the adaptive Metropolis-Hastings method. For a class \({\mathcal F}(\Omega)\) of input data, the following error criteria are defined and analyzed: the individual error \(e(S_{n}, (f, \rho))\), the worst-case error \(e(S_{n}, {\mathcal F}(\Omega))\) on the class \({\mathcal F}(\Omega)\), and the complexity \(e_{n}({\mathcal F}(\Omega))\) of the problem.

    In Section 2 the simple Monte Carlo algorithm \(S_{n}^{\mathrm{simple}}(f, \rho)\) and the Metropolis-Hastings algorithm \(S_{n}^{\mathrm{mh}}(f, \rho)\), which is based on a Markov chain of points of \(\Omega\), are developed. A functional class \({\mathcal F}_{\mathcal C}(\Omega)\) and a class \({\mathcal F}^{\alpha}(\Omega)\) with log-concave densities are introduced.

    In Section 3 the error of integration on the class \({\mathcal F}_{\mathcal C}(\Omega)\) is analyzed. Theorem 1 gives a lower bound on the worst-case error \(e(S_{n}, {\mathcal F}_{\mathcal C}(\Omega))\) for all adaptive or non-adaptive methods \(S_{n}\), and Theorem 2 provides an upper bound on the worst-case error \(e(S_{n}^{\mathrm{simple}}, {\mathcal F}_{\mathcal C}(\Omega))\) of the simple Monte Carlo method. For sufficiently large \(n\) both bounds are of the same order \(n^{-1/2}\).

    In Section 4 the error of integration on the class \({\mathcal F}^{\alpha}(\Omega)\) is analyzed. Restrictions on the input data, in particular on the density, are proposed in order to improve the complexity. Theorem 3 proves a lower bound on the worst-case error \(e(S_{n}, {\mathcal F}^{\alpha}(\Omega))\) for all non-adaptive methods, in particular for the simple Monte Carlo method. The ``ball-walk-step'' procedure for constructing a Markov chain \((K, \pi)\) with invariant distribution \(\pi\) is recalled, and its properties, such as reversibility, uniform ergodicity, local conductance and conductance, are explained. Theorem 4 gives a lower bound on the conductance of the Metropolis chain constructed by the ball-walk-step method, where the density \(\rho\) belongs to the class of log-concave weights on \(\Omega\); the bound is expressed in terms of the local conductance. Theorem 5 gives an upper bound on the individual error of the Metropolis algorithm based on this Markov chain for the class \({\mathcal F}^{\alpha}(B^{d})\), where \(B^{d}\) is the \(d\)-dimensional unit ball.
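    To make the two methods concrete, the following is a minimal Python sketch of the two estimators discussed above, under the assumption that \(\Omega = B^{d}\) is the unit ball and \(\mu\) is the uniform distribution on it. The function names (sample_unit_ball, simple_mc, metropolis_ball_walk) and the parameters delta (proposal radius) and n_burn (burn-in length) are illustrative choices for this sketch, not notation from the paper.

```python
import numpy as np

def sample_unit_ball(d, rng):
    """Draw a point uniformly from the d-dimensional unit ball B^d."""
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)                  # uniform direction on the sphere
    return rng.random() ** (1.0 / d) * x    # radius with density proportional to r^(d-1)

def simple_mc(f, rho, d, n, rng):
    """Non-adaptive simple Monte Carlo estimator of S(f, rho):
    ratio of sample means of f*rho and rho over i.i.d. draws from mu."""
    num = den = 0.0
    for _ in range(n):
        x = sample_unit_ball(d, rng)        # X_i ~ mu (uniform on B^d)
        w = rho(x)                          # unnormalised density value
        num += f(x) * w
        den += w
    return num / den

def metropolis_ball_walk(f, rho, d, n, delta, n_burn, rng):
    """Metropolis chain with ball-walk proposals of radius delta.
    Its invariant distribution is proportional to rho d(mu), so the
    time average of f along the chain estimates S(f, rho)."""
    x = sample_unit_ball(d, rng)
    rx = rho(x)
    total = 0.0
    for k in range(n_burn + n):
        y = x + delta * sample_unit_ball(d, rng)   # ball-walk proposal
        if np.linalg.norm(y) <= 1.0:               # proposals leaving B^d are rejected
            ry = rho(y)
            if rng.random() * rx <= ry:            # accept with probability min(1, ry/rx)
                x, rx = y, ry
        if k >= n_burn:                            # average f after burn-in
            total += f(x)
    return total / n

# Illustrative run with a log-concave weight rho(x) = exp(-alpha*||x||) on B^3
rng = np.random.default_rng(0)
alpha, d = 5.0, 3
rho = lambda x: np.exp(-alpha * np.linalg.norm(x))
f = lambda x: x[0] ** 2
print(simple_mc(f, rho, d, 10_000, rng))
print(metropolis_ball_walk(f, rho, d, 10_000, delta=0.3, n_burn=1_000, rng=rng))
```

    In this sketch the simple Monte Carlo estimator weights i.i.d. samples from \(\mu\) by \(\rho\), whereas the Metropolis chain adaptively concentrates its samples where \(\rho\) is large, which is what drives the improved error bounds on the log-concave class \({\mathcal F}^{\alpha}(B^{d})\).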
    Monte Carlo methods
    log-concave density
    rapidly mixing Markov chain
    optimal algorithms
    adaptivity
    complexity
    Metropolis-Hastings method
    worst-case error
    ball-walk-step method