Improved bounds in Stein's method for functions of multivariate normal random vectors

From MaRDI portal
Publication:6204800




Abstract: In a recent paper, Gaunt (2020) extended Stein's method to limit distributions that can be represented as a function $g:\mathbb{R}^d\rightarrow\mathbb{R}$ of a centered multivariate normal random vector $\Sigma^{1/2}\mathbf{Z}$, with $\mathbf{Z}$ a standard $d$-dimensional multivariate normal random vector and $\Sigma$ a non-negative definite covariance matrix. In this paper, we obtain improved bounds, in the sense of weaker moment conditions, smaller constants and simpler forms, for the case in which $g$ has derivatives with polynomial growth. We obtain new non-uniform bounds for the derivatives of the solution of the Stein equation and use these inequalities to obtain general bounds on the distance, measured using smooth test functions, between the distributions of $g(\mathbf{W}_n)$ and $g(\mathbf{Z})$, where $\mathbf{W}_n$ is a standardised sum of random vectors with independent components and $\mathbf{Z}$ is a standard $d$-dimensional multivariate normal random vector. We apply these general bounds to obtain bounds for the chi-square approximation of the family of power divergence statistics (special cases include the Pearson and likelihood ratio statistics), for the case of two cell classifications, that improve on existing results in the literature.
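As background for the application mentioned above, the power divergence family (Cressie–Read) is the standard one-parameter family $T_\lambda = \frac{2}{\lambda(\lambda+1)}\sum_j O_j\bigl[(O_j/E_j)^\lambda - 1\bigr]$ over observed counts $O_j$ and expected counts $E_j$; $\lambda = 1$ recovers Pearson's chi-square statistic and $\lambda \to 0$ the likelihood ratio statistic. A minimal sketch of this family (the function name and the two-cell example are illustrative, not from the paper):

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power divergence statistic T_lambda.

    lam = 1 gives Pearson's chi-square statistic; the limit
    lam -> 0 gives the likelihood ratio statistic.
    """
    if abs(lam) < 1e-12:
        # Likelihood ratio limit: 2 * sum O_j * log(O_j / E_j)
        return 2.0 * sum(o * math.log(o / e) for o, e in zip(observed, expected))
    return (2.0 / (lam * (lam + 1))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
    )

# Two-cell classification, as in the paper's application:
# 100 trials, equal expected counts under the null.
pearson = power_divergence([55, 45], [50, 50], lam=1.0)   # -> 1.0
lik_ratio = power_divergence([55, 45], [50, 50], lam=0.0)
```

For two cells both statistics are compared against a chi-square distribution with one degree of freedom; the paper bounds the error of that approximation.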



