Distributional transformations without orthogonality relations (Q521960)

From MaRDI portal
scientific article (English)
Publication date: 12 April 2017

    In this paper, the authors study distributional transformations inspired by Stein's method. Let \(X\) be a random variable, \(m \in \left\{ 0,1,\ldots \right\}\) and \(P\) a measurable function. Denote by \(\mathcal{C}^{m}\) the collection of functions whose \(m\)th derivative exists and is measurable on \(\mathbb{R}\). Set \(\mathcal{F}^{m}\left(P\right)=\left\{ F \in \mathcal{C}^{m} : E\left[ \, | P\left(X\right)F\left(X\right) | \, \right]< \infty \right\}\), suppressing the dependence on \(X\) on the left-hand side. Consider transformations of a given \(X\) into \(X^{(P)}\), characterized by \[ E\left[ P\left(X\right)F\left(X\right)\right] = \alpha\, E\bigl[F^{(m)}(X^{(P)})\bigr] \quad \text{for all } F \in \mathcal{F}^{m}\left(P\right), \] where necessarily \(m!\,\alpha = E\left[ P\left(X\right)X^{m}\right]\) whenever \(X^{m} \in \mathcal{F}^{m}\left(P\right)\), and \(\alpha > 0\). The distribution of the random variable \(X^{(P)}\) is called the \(X\)-\(P\)-biased distribution. \textit{L. Goldstein} and \textit{G. Reinert} [J. Theor. Probab. 18, No. 1, 237--260 (2005; Zbl 1072.62002)] prove a result that guarantees the existence and uniqueness of the distribution of such a random variable \(X^{(P)}\) if \(P\) is a function with exactly \(m\) sign changes on \(\mathbb{R}\) and if there exists an \(\alpha > 0\) such that the \textit{orthogonality relations} \( E\left[X^{k} P\left( X\right) \right] = m!\,\alpha\,\delta_{k,m} \) hold for \(k=0,1,\ldots,m\).
    From the authors' abstract: We prove two abstract existence and uniqueness results for such distributional transformations, generalizing their \(X\)-\(P\)-bias transformation. On the one hand, we show how one can abandon previously necessary orthogonality relations by subtracting an explicitly known polynomial, depending on the test function, from the test function itself. On the other hand, we prove that, for a given nonnegative integer \(m\), it is possible to obtain the expectation of the \(m\)th derivative of the test function with respect to the transformed distribution in the defining equation, even though the biasing function may have \(k<m\) sign changes, if these two numbers have the same parity. We explain how these results can be used to guarantee the existence of two different generalizations of the zero-bias transformation by \textit{L. Goldstein} and \textit{G. Reinert} [Ann. Appl. Probab. 7, No. 4, 935--952 (1997; Zbl 0903.60019)]. Further applications include the derivation of Stein-type characterizations without needing to solve any Stein equation and the presentation of a general framework for estimating the distance from the distribution of a given real random variable \(X\) to that of a random variable \(Z\) whose distribution is characterized by some \(m\)th-order linear differential operator. We also explain the fact that, in general, the biased distribution depends on the choice of the sign-change points if these are ambiguous. This new phenomenon does not appear in the framework of [loc. cit., Zbl 1072.62002].
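
    For concreteness, here is a minimal worked sketch of how two standard transformations fit the defining identity above; these are classical special cases, not results specific to the paper under review, and the particular choices of \(P\) and \(m\) are the usual ones rather than notation taken from the paper. Taking \(m=1\) and \(P(x)=x\) for a random variable \(X\) with \(E[X]=0\) and \(E[X^{2}]=\sigma^{2}>0\), one has \(1!\,\alpha = E[P(X)X] = \sigma^{2}\), and the defining identity reads \[ E\left[X F(X)\right] = \sigma^{2}\, E\left[F'(X^{(P)})\right] \quad \text{for all } F \in \mathcal{F}^{1}(P); \] here \(P\) has exactly one sign change, the orthogonality relations reduce to \(E[P(X)] = E[X] = 0\) and \(E[X P(X)] = \sigma^{2} = 1!\,\alpha\), and \(X^{(P)}\) has the zero-bias distribution of \(X\). Taking \(m=0\) and \(P(x)=x\) for a nonnegative \(X\) with \(E[X]=\mu>0\) (so that \(P(X)\ge 0\) almost surely and no sign change occurs on the support), one has \(\alpha = \mu\) and the identity becomes \[ E\left[X F(X)\right] = \mu\, E\left[F(X^{(P)})\right], \] that is, \(X^{(P)}\) follows the size-biased distribution of \(X\).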

    Keywords:
    Stein's method
    distributional transformations
    zero-bias transformation
    size-bias transformation
    equilibrium distribution
    Stein characterizations
    higher-order Stein operators