On the ergodicity properties of some adaptive MCMC algorithms (Q862214)

Property / reviewed by: Vassil St. Grozdanov
Property / MaRDI profile type: MaRDI publication profile
Property / OpenAlex ID: W2064871928
Property / arXiv ID: math/0610317
Property / cites work: On the ergodicity properties of some adaptive MCMC algorithms
Property / cites work: Stability of Stochastic Approximation under Verifiable Conditions
Property / cites work: Finite Mixture Distributions, Sequential Likelihood and the EM Algorithm
Property / cites work: On adaptive Markov chain Monte Carlo algorithms
Property / cites work: Renewal theory and computable convergence rates for geometrically ergodic Markov chains
Property / cites work: Q3997575
Property / cites work: Some Multivariate Chebyshev Inequalities with Extensions to Continuous Parameter Processes
Property / cites work: Convergence and robustness of the Robbins-Monro algorithm truncated at randomly varying bounds
Property / cites work: Q4720608
Property / cites work: Q4331760
Property / cites work: Q5671422
Property / cites work: On an adaptive version of the Metropolis–Hastings algorithm with independent proposal distribution
Property / cites work: Inference from iterative simulation using multiple sequences
Property / cites work: A Liapounov bound for solutions of the Poisson equation
Property / cites work: An adaptive Metropolis algorithm
Property / cites work: Q3911791
Property / cites work: Geometric ergodicity of Metropolis algorithms
Property / cites work: Q4346705
Property / cites work: A maximal inequality and dependent strong laws
Property / cites work: Q4363980
Property / cites work: Rates of convergence of the Hastings and Metropolis algorithms
Property / cites work: Markov chains and stochastic stability
Property / cites work: Computable bounds for geometric convergence rates of Markov chains
Property / cites work: On the Poisson equation in the potential theory of a single kernel.
Property / cites work: Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
Property / cites work: Statistical analysis of finite mixture distributions
Property / cites work: On the convergence properties of the EM algorithm


Language: English
Label: On the ergodicity properties of some adaptive MCMC algorithms
Description: scientific article

    Statements

    On the ergodicity properties of some adaptive MCMC algorithms (English)
    5 February 2007
    Markov chain Monte Carlo (MCMC) is a popular computational method for generating samples from a stationary distribution \(\pi.\) The quantity of interest is the integral \(\pi(f) = \int_{X} f(x)\,\pi(dx),\) where \(X \subset {\mathbb R}^{n}\) may be high dimensional; it is approximated by the empirical average \(S_{n}(f) = n^{-1}\sum_{k=1}^{n} f(X_{k}),\) where \(\{X_{k} : k \geq 1\}\) is an ergodic Markov chain on \(X\) with transition probability \(P\) and stationary distribution \(\pi.\) The introduction surveys several such algorithms, among them the Metropolis-Hastings (MH) algorithm and the symmetric increments random walk MH algorithm (SRWM), and recalls related results.
    Section 2 develops the central concept of the paper, the adaptive MCMC algorithm, which depends on a tuning parameter \(\theta\) taking values in a space \(\Theta.\) The authors set \(\theta_{0} = \theta \in \Theta,\) \(X_{0} = x \in X,\) and for \(k \geq 0\) the sequence \(\{(X_{k}, \theta_{k}) : k \geq 0\}\) is defined recursively: if \(\theta_{k} = \theta_{c},\) then \(\theta_{k+1} = \theta_{c}\) and \(X_{k+1} = x;\) otherwise \((X_{k+1}, \theta_{k+1}) \sim Q_{\rho_{k+1}}((X_{k}, \theta_{k}); \cdot),\) where \(\rho = (\rho_{k})\) is a sequence of stepsizes. The associated homogeneous Markov chain \(\{Z_{k} : k \geq 0\}\) is then introduced.
    Section 3 formulates three assumptions and contains the main result, Theorem 8, a strong law of large numbers (LLN) for \(S_{n}(f)\): it establishes the almost-sure convergence of \(S_{n}(f)\) to \(\pi(f).\) Theorem 9, the main result of Section 4, establishes an invariance principle under conditions more stringent than those required for the LLN for \(S_{n}(f).\) Section 5 introduces two further conditions; Theorem 11 gives the convergence with probability one of the noisy sequence \(\{\theta_{k}\}.\)
    Section 6 applies these results to the SRWM algorithm: Theorem 15 shows that the strong LLN holds for any \(\alpha \in [0,1)\) and any function \(f \in {\mathcal L}(W^{\alpha}),\) and that the central limit theorem holds for any function \(f \in {\mathcal L}(W^{\alpha \over 2}).\) Section 7 applies the results to the independent MH algorithm, developing the general properties required for the LLN and the invariance principle; its main result is Theorem 21, which shows, for any function \(f \in {\mathcal L}_{V^{\alpha}},\) the almost-sure convergence of \(S_{n}(f)\) to \(\pi(f),\) and, for any function \(f \in {\mathcal L}_{V^{\alpha \over 2}},\) the convergence in the distribution sense.
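    To make the recursion above concrete, the following small Python sketch (purely illustrative, not taken from the paper) adapts the log proposal scale \(\theta\) of a one-dimensional symmetric random-walk Metropolis sampler with diminishing stepsizes \(\rho_{k} = 1/k\), a Robbins-Monro-type update towards an assumed target acceptance rate of 0.44, and returns the empirical average \(S_{n}(f)\) as an estimate of \(\pi(f)\); the function name, the stepsize choice, and the target rate are assumptions made for illustration only.

    import numpy as np

    # Illustrative sketch only (not the paper's algorithm): a one-dimensional
    # adaptive random-walk Metropolis sampler with vanishing adaptation.
    # theta is the log proposal scale, updated with stepsizes rho_k = 1/k
    # towards an assumed target acceptance rate; S_n(f) approximates pi(f).
    def adaptive_rwm(log_pi, f, x0, n_iter=50_000, target_accept=0.44, seed=0):
        rng = np.random.default_rng(seed)
        x, theta = x0, 0.0            # theta_0 = 0: proposal std starts at exp(0) = 1
        running_sum = 0.0
        for k in range(1, n_iter + 1):
            # symmetric random-walk MH step with the current scale exp(theta)
            prop = x + np.exp(theta) * rng.standard_normal()
            accept_prob = min(1.0, np.exp(log_pi(prop) - log_pi(x)))
            if rng.uniform() < accept_prob:
                x = prop
            # diminishing adaptation: rho_k -> 0, so the perturbation of the
            # transition kernel vanishes as k grows
            rho = 1.0 / k
            theta += rho * (accept_prob - target_accept)
            running_sum += f(x)
        return running_sum / n_iter   # S_n(f), an estimate of pi(f)

    # Example: second moment of a standard normal target (exact value 1)
    estimate = adaptive_rwm(lambda x: -0.5 * x * x, lambda x: x * x, x0=0.0)

    The paper itself treats a far more general family of kernels \(Q_{\rho}\) on a parameter space \(\Theta\), including randomly varying truncations; the sketch only shows the diminishing-adaptation mechanism underlying the LLN of Theorem 8.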
    Keywords:
    adaptive Markov chain Monte Carlo algorithm
    self-tuning algorithm
    Metropolis-Hastings algorithm
    stochastic approximation
    state-dependent noise
    randomly varying truncation
    martingale
    Poisson method
