Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference (Q2388330)
Language | Label | Description | Also known as |
---|---|---|---|
English | Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference | scientific article | |
Statements
Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference (English)
12 September 2005
A particle system is a collection \((\vartheta^{(j,H)},w^{(j,H)})_{j\leq H}\), where \(\vartheta^{(j,H)}\in\Theta\) are ``particles'' and \(w^{(j,H)}>0\) are their ``weights''. The system targets a distribution \(\pi\) on \(\Theta\) if, for any measurable \(\varphi\) with \(| \mathbf{E}_\pi (\varphi)| <\infty\), \[ \hat E_H(\varphi)= \frac{ \sum_{j=1}^H w^{(j,H)}\varphi(\vartheta^{(j,H)}) }{ \sum_{j=1}^H w^{(j,H)} } \to \mathbf{E}_\pi (\varphi). \] A sequential Monte Carlo algorithm (a particle filter) recursively produces, via a mutation-correction-resampling scheme, a sequence of particle systems targeting a sequence of distributions \(\pi_t\) on \(\Theta_t\). In Bayesian estimation problems, \(\Theta_t=\Theta\) is the parameter space and \(\pi_t\) is the posterior distribution of the parameter \(\vartheta\) given a sample of size \(t\). In state-space filtering or smoothing, \(\Theta_t\) is the space of state trajectories and \(\pi_t\) is the conditional distribution of the trajectory given the data. The author obtains conditions for a central limit theorem of the form \(\sqrt{H}(\hat E_H(\varphi)-\mathbf{E}_{\pi_t} (\varphi)) \Rightarrow N(0,V_t(\varphi))\), where \(V_t(\varphi)\) is given by recursive formulae. These conditions hold for many sequential Monte Carlo algorithms, including the resample-move algorithm and the residual resampling scheme. The asymptotics of \(V_t(\varphi)\) as \(t\to\infty\) are investigated for Bayesian problems.
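To make the mutation-correction-resampling recursion above concrete, the following is a minimal Python sketch of a single update for a one-dimensional Bayesian problem. It illustrates the general scheme rather than the paper's construction: the function names (`weighted_estimate`, `residual_resample`, `smc_step`), the random-walk Metropolis-Hastings mutation kernel, and the arguments `log_incremental` (incremental likelihood of the new observation) and `log_target` (current posterior density, up to a constant) are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)


def weighted_estimate(particles, weights, phi):
    """Self-normalised estimator: sum_j w_j phi(theta_j) / sum_j w_j."""
    return np.sum(weights * phi(particles)) / np.sum(weights)


def residual_resample(particles, weights, rng):
    """Residual resampling: keep floor(H * w_j) copies of particle j
    deterministically, then fill the remaining slots multinomially from
    the residual probabilities."""
    H = len(particles)
    w = weights / np.sum(weights)
    counts = np.floor(H * w).astype(int)          # deterministic copies
    n_rest = H - counts.sum()                     # slots still to fill
    if n_rest > 0:
        residual = H * w - counts                 # leftover (unnormalised) probabilities
        counts += rng.multinomial(n_rest, residual / residual.sum())
    return np.repeat(particles, counts)


def smc_step(particles, weights, log_incremental, log_target, rng, mh_scale=0.2):
    """One mutation-correction-resampling update when a new observation arrives."""
    # Correction: reweight by the incremental likelihood of the new datum.
    weights = weights * np.exp(log_incremental(particles))
    # Resampling (residual scheme); weights are reset to be equal afterwards.
    particles = residual_resample(particles, weights, rng)
    weights = np.full(len(particles), 1.0 / len(particles))
    # Mutation: a random-walk Metropolis-Hastings move that leaves the current
    # posterior (log_target, known up to a constant) invariant, in the spirit
    # of resample-move algorithms.
    proposal = particles + mh_scale * rng.standard_normal(len(particles))
    log_accept = log_target(proposal) - log_target(particles)
    accept = np.log(rng.uniform(size=len(particles))) < log_accept
    particles = np.where(accept, proposal, particles)
    return particles, weights


# Illustrative usage: posterior mean of theta for a N(theta, 1) observation
# model with a N(0, 10) prior, after one (made-up) observation y = 1.2.
H = 1000
particles = rng.normal(0.0, np.sqrt(10.0), size=H)   # draws from the prior
weights = np.full(H, 1.0 / H)
y = 1.2
log_inc = lambda th: -0.5 * (y - th) ** 2
log_post = lambda th: -0.5 * th ** 2 / 10.0 - 0.5 * (y - th) ** 2
particles, weights = smc_step(particles, weights, log_inc, log_post, rng)
print(weighted_estimate(particles, weights, lambda th: th))  # approx. posterior mean
```

Note that after resampling the weights are equal, so \(\hat E_H(\varphi)\) reduces to a plain average; the residual scheme and the resample-move style mutation used in this sketch are among the algorithm choices covered by the central limit theorem discussed above.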
Markov chain Monte Carlo method
particle filter
resample-move algorithm
residual resampling
state-space model
central limit theorem
Bayesian problems