Uniform integrability of the OLS estimators, and the convergence of their moments

From MaRDI portal
Publication:2397992

DOI: 10.1007/s11749-016-0498-y
zbMATH Open: 1369.62150
arXiv: 1511.02962
OpenAlex: W3100565843
MaRDI QID: Q2397992

G. Afendras, Marianthi Markatou

Publication date: 14 August 2017

Published in: Test

Abstract: The problem of convergence of the moments of a sequence of random variables to the moments of its asymptotic distribution is important in many applications. These include the determination of the optimal training-sample size in cross-validation estimation of the generalization error of computer algorithms, and the construction of graphical methods for studying dependence patterns between two biomarkers. In this paper we prove the uniform integrability of the ordinary least squares (OLS) estimators of a linear regression model, under suitable assumptions on the design matrix and the moments of the errors. Further, we prove the convergence of the moments of the estimators to the corresponding moments of their asymptotic distribution, and study the rate of this moment convergence. The canonical central limit theorem corresponds to the simplest linear regression model; we investigate the rate of the moment convergence in the canonical central limit theorem, proving a sharp improvement of von Bahr's (1965) theorem.
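The moment convergence described in the abstract can be illustrated numerically. The sketch below (not from the paper; the model, parameter values, and replication counts are all illustrative assumptions) simulates a simple linear regression y = alpha + beta*x + eps and checks by Monte Carlo that the second moment of sqrt(n)*(beta_hat - beta) approaches the asymptotic variance sigma^2 / Var(x) as n grows:

```python
import numpy as np

# Illustrative sketch, not the paper's method: Monte Carlo estimate of
# E[(sqrt(n) * (beta_hat - beta))^2] for the OLS slope in the model
# y = alpha + beta * x + eps with i.i.d. N(0, sigma^2) errors, compared
# against the asymptotic variance sigma^2 / Var(x).

rng = np.random.default_rng(0)
alpha, beta, sigma = 1.0, 2.0, 1.0
reps = 2000  # Monte Carlo replications per sample size (arbitrary choice)

def second_moment(n):
    """Estimate E[(sqrt(n) * (beta_hat - beta))^2] over `reps` replications."""
    vals = np.empty(reps)
    for r in range(reps):
        x = rng.uniform(0.0, 1.0, size=n)   # Uniform(0,1) regressor: Var(x) = 1/12
        eps = rng.normal(0.0, sigma, size=n)
        y = alpha + beta * x + eps
        # OLS slope via sample covariance / sample variance
        b_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
        vals[r] = n * (b_hat - beta) ** 2
    return vals.mean()

asymptotic_var = sigma**2 / (1.0 / 12.0)    # sigma^2 / Var(x) = 12
for n in (20, 80, 320):
    print(n, second_moment(n), "->", asymptotic_var)
```

Under the paper's uniform-integrability result, the printed second moments should settle near the asymptotic value 12 as n increases; the simulation only visualizes this, it does not establish the rate of convergence studied in the paper.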


Full work available at URL: https://arxiv.org/abs/1511.02962




Cited In (8)




