Squared chaotic random variables: new moment inequalities with applications (Q897694)

Cited works

Gaussian variables, polynomials and permanents
Fourth moment theorems for Markov diffusion generators
Generalization of the Nualart-Peccati criterion
The plank problem for symmetric bodies
Lower bounds for norms of products of polynomials
On the unlinking conjecture of independent polynomial functions
Determinant inequalities via information theory
A combinatorial proof of the Mehler formula
Pfaffians, hafnians and products of real linear functionals
Characterization of equality in the correlation inequality for convex functions, the U-conjecture
Itô-Wiener chaos expansion with exact residual and correlation, variance inequalities
Q5179661
On an independence criterion for multiple Wiener integrals
A Gaussian inequality for expected absolute products
Normal approximations with Malliavin calculus
Linear polarization constants of Hilbert spaces
Lower bounds for norms of products of polynomials via Bombieri inequality
Q4263967
Geometry of spaces of polynomials
On independence and conditioning on Wiener space


Language: English
Label: Squared chaotic random variables: new moment inequalities with applications
Description: scientific article

    Statements

    Squared chaotic random variables: new moment inequalities with applications (English)
    7 December 2015
    \textit{P. E. Frenkel} [Math. Res. Lett. 15, No. 2--3, 351--358 (2008; Zbl 1160.46311)] proved that if \(X_1,\dots,X_n\) are jointly normal random variables with zero expectation, then \[ \operatorname{E}X_1^2 \cdots \operatorname{E}X_n^2\leq \operatorname{E}(X_1^2\cdots X_n^2). \] In the present paper, the authors extend this result by showing that if \((G_1,\dots,G_n)\) is a real-valued centered Gaussian vector whose components have unit variance, then \[ \operatorname{E}[H_{p_1}(G_1)^2]\cdots \operatorname{E}[H_{p_n}(G_n)^2] \leq \operatorname{E}[H_{p_1}(G_1)^2\cdots H_{p_n}(G_n)^2] \] for all integers \(p_1,\dots,p_n\geq1\), where the \(H_{p_i}\) are the Hermite polynomials defined recursively by \(H_0=1\) and \(H_{k+1}=\delta H_k\), with \(\delta f(x)=xf(x)-f'(x)\). This result is used to obtain a refinement of the well-known Hadamard inequality, which asserts that \(\det A\leq \prod_{i=1}^{m} a_{ii}\) for every positive definite \(m\times m\) matrix \(A=[a_{ij}]\). Let \(S=[s_{ij}]\) be an \(n\times n\) positive definite matrix, let \(Z=\mathrm{diag}(s_{11},\dots,s_{nn})\), and let \(I\) be the identity matrix. The authors prove that if \(Z<I\) and \(Z+S<2I\), then \[ \Sigma=I-\frac{1}{2}(I-Z)^{-1/2}(S-Z)(I-Z)^{-1/2} \] is a positive definite matrix with \(\Sigma_{ii}=1\). Moreover, for every centered Gaussian vector \((X_1,\dots,X_n)\) with covariance matrix \(\Sigma\), \[ \det S=\left(\sum_{k_1,\dots,k_n=0}^{\infty}\frac{\operatorname{E}[H_{k_1}(X_1)^2\cdots H_{k_n}(X_n)^2]}{k_1!\cdots k_n!}\prod_{i=1}^{n}\sqrt{s_{ii}}(1-s_{ii})^{k_i}\right)^{-2}. \]
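    The displayed moment inequality is easy to check numerically. The following Monte Carlo sketch (illustrative only, not taken from the paper) uses the probabilists' Hermite polynomials, which satisfy the recursion \(H_{k+1}=\delta H_k\) above; the degrees \(p_1=2\), \(p_2=3\), the correlation \(0.7\), and the sample size are arbitrary choices.

```python
# Monte Carlo check of E[H_{p1}(G1)^2] E[H_{p2}(G2)^2] <= E[H_{p1}(G1)^2 H_{p2}(G2)^2]
# for a correlated pair of standard Gaussian variables. Illustrative sketch only:
# the degrees, the correlation rho, and the sample size are arbitrary choices.
import numpy as np
from numpy.polynomial.hermite_e import hermeval  # probabilists' Hermite polynomials He_k

rng = np.random.default_rng(0)
rho, p1, p2, n_samples = 0.7, 2, 3, 1_000_000

# Centered Gaussian vector with unit variances and correlation rho.
cov = np.array([[1.0, rho], [rho, 1.0]])
g = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_samples)

def hermite_sq(p, x):
    """Return He_p(x)^2, where He_p is the degree-p probabilists' Hermite polynomial."""
    return hermeval(x, [0.0] * p + [1.0]) ** 2

h1, h2 = hermite_sq(p1, g[:, 0]), hermite_sq(p2, g[:, 1])

lhs = h1.mean() * h2.mean()   # product of the individual second moments (close to p1! * p2!)
rhs = (h1 * h2).mean()        # second moment of the product
print(f"lhs = {lhs:.3f}  rhs = {rhs:.3f}  inequality holds: {lhs <= rhs}")
```

    Since \(\operatorname{E}[H_p(G)^2]=p!\) for a standard Gaussian \(G\), the left-hand side is close to \(p_1!\,p_2!=12\); for independent components (correlation \(0\)) the two sides agree up to Monte Carlo error.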
    Gaussian vectors
    Hadamard inequality
    variance inequalities
