Sequential Monte Carlo smoothing for general state space hidden Markov models (Q657691)
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | Sequential Monte Carlo smoothing for general state space hidden Markov models | scientific article | |
Statements
Sequential Monte Carlo smoothing for general state space hidden Markov models (English)
10 January 2012
For any sequence \(\{a_n\}_{n\geq 0}\) and any pair of integers \(0\leq m\leq n\), let \(a_{m:n}=(a_m,\ldots, a_n)\). All random variables are defined on a common probability space \((\Omega, \mathcal{F}, \mathrm{P})\). The sets \(\mathbb{X}\) and \(\mathbb{Y}\) are Polish spaces, and \(B(\mathbb{X})\) and \(B(\mathbb{Y})\) are the associated Borel \(\sigma\)-algebras. The state process \(\{X_t\}_{t\geq 0}\) is a Markov chain on the state space \((\mathbb{X},B(\mathbb{X}))\) with initial distribution \(\chi\) and transition kernel \(M\). The state process is hidden but partially observed through the observations \(\{Y_t\}_{t\geq 0}\), which are \(\mathbb{Y}\)-valued random variables, conditionally independent given the latent state sequence \(\{X_t\}_{t\geq0}\). It is assumed that there exist a \(\sigma\)-finite measure \(\lambda\) on \((\mathbb{Y},B(\mathbb{Y}))\) and a nonnegative transition density function \(g\) on \(\mathbb{X}\times\mathbb{Y}\) such that \[ \mathrm{P}\{Y_t\in A\mid X_t\}=\int_{A}g(X_t,y)\,\lambda(dy) \quad\text{for all } A\in B(\mathbb{Y}). \] The kernel \(M\) and the transition density \(g\) are assumed known.

In this paper, the observations \(Y_{0:T}=(y_0, \ldots, y_{T})\) are arbitrary but fixed. The main task is the estimation of the posterior distribution of (different subsets of) the state vector \(X_{0:T}\) given these observations. Conditionally on the observations \(Y_{0:T}\), the state sequence \(\{X_s\}_{s\geq 0}\) is a time-inhomogeneous Markov chain. The aim of the paper is to provide a foundation for the particle-based approximation of such distributions and to analyse, in a common unifying framework, different sequential Monte Carlo (SMC) schemes producing such approximations. General convergence results, including exponential deviation inequalities and a central limit theorem (CLT), are obtained.

The paper is organized as follows. The introduction states the problems and briefly relates them to known results. In Section 2, the forward filtering backward smoothing (FFBSm) algorithm and the forward filtering backward simulation (FFBSi) sampler are introduced, a fast version of the FFBSi algorithm is presented, and SMC filters are described. The main statements of the paper are Propositions 1 and 2. Let \(Z_{S}^{N}\) denote the number of simulations required in the accept-reject procedure of Algorithm 1. Proposition 1 gives the limits in probability of \(Z_{S}^{N}/N\) for the bootstrap auxiliary filter and in the fully adapted case. Proposition 2 gives an upper bound on the expected number of elementary operations required by Algorithm 1, provided the transition kernel is bounded from above and below. In Section 3, the convergence of the FFBSm and FFBSi algorithms is considered: an exponential deviation inequality for the fixed-interval joint smoothing distribution is derived and a CLT is established. In Section 4, time-uniform exponential bounds for the error of the FFBSm marginal smoothing distribution are obtained under mixing conditions on the kernel \(M\); under the same mixing conditions, an explicit bound on the asymptotic variance of the marginal smoothing distribution estimator is derived. In Section 5, the proofs of Propositions 1 and 2 are presented. Appendices A and B contain some technical results, together with the description and analysis of an efficient multinomial sampling procedure (Algorithm 2) and an estimate of the number of elementary operations it requires.
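To make the forward filtering / backward simulation structure concrete, the following is a minimal sketch, not taken from the paper: it assumes a toy scalar linear-Gaussian model with illustrative parameter values (`phi`, `sigma_x`, `sigma_y`, `N`, `M` are assumptions for the example), runs a bootstrap particle filter forward, and then draws smoothing trajectories with a plain FFBSi backward pass. The paper's Algorithm 1 accelerates the backward draws with an accept-reject step when the transition density is bounded; the quadratic-cost loop below is the naive variant.

```python
# Minimal sketch of forward filtering backward simulation (FFBSi) on a toy
# linear-Gaussian model. All model choices and parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
phi, sigma_x, sigma_y = 0.9, 1.0, 0.5   # assumed model parameters
N, T, M = 500, 50, 100                  # particles, time horizon, smoothing trajectories

def transition_density(x_prev, x_next):
    """Gaussian transition density m(x_prev, x_next) of the toy model."""
    return np.exp(-0.5 * ((x_next - phi * x_prev) / sigma_x) ** 2) / (sigma_x * np.sqrt(2.0 * np.pi))

def observation_density(x, y):
    """Gaussian observation density g(x, y) of the toy model."""
    return np.exp(-0.5 * ((y - x) / sigma_y) ** 2) / (sigma_y * np.sqrt(2.0 * np.pi))

# Simulate a synthetic observation record y_{0:T-1} from the model.
x_true, y_obs = np.zeros(T), np.zeros(T)
x_true[0] = rng.normal()
y_obs[0] = x_true[0] + sigma_y * rng.normal()
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + sigma_x * rng.normal()
    y_obs[t] = x_true[t] + sigma_y * rng.normal()

# Forward pass: bootstrap particle filter, storing all particles and normalized weights.
particles, weights = np.zeros((T, N)), np.zeros((T, N))
particles[0] = rng.normal(size=N)                        # draw from the initial distribution chi
w = observation_density(particles[0], y_obs[0])
weights[0] = w / w.sum()
for t in range(1, T):
    ancestors = rng.choice(N, size=N, p=weights[t - 1])  # multinomial resampling
    particles[t] = phi * particles[t - 1, ancestors] + sigma_x * rng.normal(size=N)
    w = observation_density(particles[t], y_obs[t])
    weights[t] = w / w.sum()

# Backward pass (naive FFBSi): sample particle indices backward in time,
# reweighting the filter weights by the transition density to the chosen next state.
traj = np.zeros((T, M))
idx = rng.choice(N, size=M, p=weights[T - 1])
traj[T - 1] = particles[T - 1, idx]
for t in range(T - 2, -1, -1):
    for j in range(M):
        bw = weights[t] * transition_density(particles[t], traj[t + 1, j])
        bw /= bw.sum()
        traj[t, j] = particles[t, rng.choice(N, p=bw)]

# Averaging the trajectories approximates the marginal smoothing means E[X_t | Y_{0:T-1}].
print(np.round(traj.mean(axis=1)[:5], 3))
```

The inner backward loop costs \(O(NM)\) per time step because every smoothing draw recomputes all \(N\) backward weights; this is exactly the cost that the paper's accept-reject procedure (and the bounds of Propositions 1 and 2) addresses.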
sequential Monte Carlo methods
particle filter
smoothing
hidden Markov models