On upper bounds for the variance of functions of random variables (Q1062341)
Language: English
Label: On upper bounds for the variance of functions of random variables
Description: scientific article

    Statements

    On upper bounds for the variance of functions of random variables (English)
    Publication year: 1985
    The upper bounds for the variance of a function g of a random variable X obtained in this paper were originally motivated by \textit{H. Chernoff}'s inequality [Ann. Probab. 9, 533-535 (1981; Zbl 0457.60014)]: if g is absolutely continuous with derivative g' and X is N(0,1), then \(\operatorname{Var}[g(X)]\leq E\{[g'(X)]^2\}\). \textit{L. H. Y. Chen} [J. Multivariate Anal. 12, 306-315 (1982; Zbl 0483.60011)], using the Cauchy-Schwarz (C-S) inequality, obtained a multivariate extension: if \(X_1,\dots,X_k\) are i.i.d. N(0,1) and g is a function on \(\mathbb{R}^k\) with partial derivatives \(g_1,\dots,g_k\), then \(\operatorname{Var}[g(X_1,\dots,X_k)]\leq \sum_{i=1}^{k}E\,g_i^2(X_1,\dots,X_k)\). This paper improves the variance upper bounds given by the first author [Ann. Probab. 10, 799-809 (1982; Zbl 0492.60021)] for continuous or discrete random variables, which were based on a similar use of the C-S inequality. The improvement relies on an appropriate use of both the C-S inequality and the Lagrange identity, resulting in the inequality \[ \operatorname{Var}[g(X)]\leq \int_{-\infty}^{\infty}[g'(x)]^2\Bigl\{\int_{x}^{\infty}(t-\mu)f(t)\,dt\Bigr\}\,dx, \] where, as in the inequalities above, equality holds iff g is linear. For a discrete random variable X, g'(x) is replaced by \(\Delta g(x)=g(x+1)-g(x)\) and the inner integral \(\int_{x}^{\infty}\) by the sum \(\sum_{t=x+1}^{\infty}\). It is pointed out that the earlier bounds obtained by the first author coincide with the improved ones when \(E(X)=0\). A multivariate extension for arbitrary independent discrete or continuous random variables is also given (cf. Chen, op. cit.).
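    As a numerical illustration of the displayed bound (a minimal sketch, not taken from the paper): for a standard normal X with \(\mu=0\) the kernel \(\int_x^{\infty}(t-\mu)f(t)\,dt\) equals the normal density \(\varphi(x)\), so the improved bound reduces to Chernoff's \(E\{[g'(X)]^2\}\). The test function \(g(x)=x^3\) and all names in the sketch below are illustrative choices, not the authors'.

# Numerical check (assumed setup, not from the paper): verify
# Var[g(X)] <= \int [g'(x)]^2 { \int_x^\infty (t - mu) f(t) dt } dx
# for X ~ N(0, 1) and the hypothetical test function g(x) = x^3.
import numpy as np
from scipy import integrate
from scipy.stats import norm

g = lambda x: x**3            # illustrative test function
g_prime = lambda x: 3 * x**2  # its derivative

# Left-hand side: Var[g(X)] by numerical integration against the N(0,1) density.
Eg = integrate.quad(lambda x: g(x) * norm.pdf(x), -np.inf, np.inf)[0]
Eg2 = integrate.quad(lambda x: g(x)**2 * norm.pdf(x), -np.inf, np.inf)[0]
var_g = Eg2 - Eg**2           # equals E[X^6] = 15 here

# Kernel h(x) = \int_x^\infty (t - mu) f(t) dt; for N(0,1) this is phi(x),
# but it is computed numerically to mirror the general formula.
h = lambda x: integrate.quad(lambda t: t * norm.pdf(t), x, np.inf)[0]

# Right-hand side: the improved upper bound.
bound = integrate.quad(lambda x: g_prime(x)**2 * h(x), -np.inf, np.inf)[0]

print(f"Var[g(X)] = {var_g:.4f}, upper bound = {bound:.4f}")  # 15 <= 27

    For a linear g the two sides coincide (both equal the squared slope), in line with the equality condition stated above.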
    Chernoff's inequality
    Lagrange identity