A generalized Isserlis theorem for location mixtures of Gaussian random vectors (Q654472)

Property / cites work: Q4167332
Property / cites work: On the Wick theorem for mixtures of centered Gaussian distributions
Property / cites work: From moments of sum to moments of product
Property / cites work: A general Isserlis theorem for mixed-Gaussian random variables
Property / cites work: An extension of Wick's theorem
Property / cites work: The moments of the multivariate normal

Language: English
Label: A generalized Isserlis theorem for location mixtures of Gaussian random vectors
Description: scientific article

    Statements

    A generalized Isserlis theorem for location mixtures of Gaussian random vectors (English)
    28 December 2011
    Let \(X=[X_1, X_2, \dots, X_d] \sim N(0, R)\) be a zero-mean \(d\)-dimensional Gaussian random vector with covariance matrix \(R\), and let \(A=(a_1, a_2, \dots, a_N)\) be a collection of \(N\) indices (not necessarily distinct), each between \(1\) and \(d\). Let \(X_A = \prod X_i\), where \(i\) runs over \(A\); if \(A\) is empty, then \(X_A = 1\) (the empty product). A pairing \(p(A)\) of \(A\) is a partition of \(A\) into disjoint pairs \(\{ i, i'\}\); \(s(A)\) denotes the collection of all pairings \(p(A)\), which is empty when \(N\) is odd; and \(\sum_A\prod \text{E}(X_iX_{i'})\) denotes the sum over \(s(A)\) of the products of the expectations \(\text{E}(X_iX_{i'})\), where \(\{i, i'\}\) runs over the pairs of \(p(A)\).

    Consider Isserlis' theorem in the form given by \textit{C. S. Withers} [Bull. Aust. Math. Soc. 32, 103--107 (1985; Zbl 0573.62047)]: \(\text{E}(X_A) = \sum_A\prod \text{E}(X_iX_{i'})\) if \(N\) is even, and \(\text{E}(X_A) = 0\) if \(N\) is odd \((*)\). There are many extensions of this theorem. Consider the following one, due to \textit{J. V. Michalowicz}, \textit{J. M. Nichols}, \textit{F. Bucholtz} and \textit{C. C. Olson} [Stat. Probab. Lett. 81, No. 8, 1233--1240 (2011; Zbl 1221.60021)]. Theorem: Let \(X\) have the ``mixture'' density \(0.5\varphi (x+\mu)+ 0.5\varphi (x-\mu)\), where \(\varphi\) is the density of \(N(0,R)\). Then the result \((*)\) still holds.

    In the paper under review a new and shorter proof of this result is given. Moreover, the result is extended to the case of a random vector \(X = Y + Z\), where \(Z \sim N(0, R)\) is independent of \(Y\), an arbitrary \(d\)-dimensional random vector. Note that \(Y\) may be discrete or continuous and may or may not have a density function. This generalization is the main theorem of the paper. Theorem: Let \(S\) be a subset of \(A\) and \(S'\) its complement in \(A\). Under the model described above, and assuming all the moments below exist, \(\text{E}(X_A) =\sum_1\sum_2 \text{E}(Y_S) \sum_{S'} \prod\text{E} (Z_i Z_{i'})\), where \(\sum_{S'} \prod\text{E} (Z_i Z_{i'})\) is the Isserlis sum of \((*)\) applied to the Gaussian part \(Z\) over the index set \(S'\) (equal to \(1\) when \(S'\) is empty). For \(N =2M\), the sum \(\sum_1\) runs over all values of \(k\) from \(0\) to \(M\), and for a fixed \(k\) the sum \(\sum_2\) runs over all subsets \(S\) of \(A\) of size \(2k\). For \(N =2M+1\), the sum \(\sum_1\) runs over all values of \(k\) from \(0\) to \(M\), and for a fixed \(k\) the sum \(\sum_2\) runs over all subsets \(S\) of \(A\) of size \(2k+1\). The paper ends with a similar theorem for the generalized \(d\)-dimensional hyperbolic distribution.
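    To make the combinatorics of the main theorem concrete, here is a small worked case supplied for illustration (it is not taken from the paper): take \(d \ge 3\), distinct indices \(A = (1,2,3)\), so \(N = 3 = 2M+1\) with \(M = 1\), and write \(R_{ij} = \text{E}(Z_iZ_j)\). The theorem then reads
    \[
    \text{E}(X_1X_2X_3) = \text{E}(Y_1)R_{23} + \text{E}(Y_2)R_{13} + \text{E}(Y_3)R_{12} + \text{E}(Y_1Y_2Y_3),
    \]
    where the first three terms come from \(k=0\) (subsets \(S\) of size \(1\), each paired with the Isserlis sum over the remaining two indices) and the last term from \(k=1\) (\(S=A\), with the empty Isserlis sum equal to \(1\)). The same expression follows directly by expanding \(\text{E}[(Y_1+Z_1)(Y_2+Z_2)(Y_3+Z_3)]\), using the independence of \(Y\) and \(Z\), \(\text{E}(Z_i)=0\), and \(\text{E}(Z_1Z_2Z_3)=0\).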
    Isserlis theorem
    normal-variance mixture
    generalized hyperbolic distribution