On consistency of the least squares estimators in linear errors-in-variables models with infinite variance errors
Publication: 391833
DOI: 10.1214/13-EJS863 · zbMath: 1349.62316 · MaRDI QID: Q391833
Publication date: 13 January 2014
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1385995293
Keywords: infinite variance; signal-to-noise ratio; measurement errors; explanatory variables; domain of attraction of the normal law; least squares estimators; linear structural and functional errors-in-variables models; reliability ratio; slowly varying function at infinity; weak and strong consistency
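For orientation, a minimal sketch of the standard formulation these keywords refer to (not reproduced from the paper itself): in the linear structural or functional errors-in-variables model one observes
\[
x_i = \xi_i + \delta_i, \qquad y_i = \alpha + \beta\,\xi_i + \varepsilon_i, \qquad i = 1,\dots,n,
\]
where the true explanatory variables \(\xi_i\) and the errors \(\delta_i, \varepsilon_i\) are unobserved, and the least squares slope estimator is
\[
\widehat{\beta}_n = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sum_{i=1}^{n}(x_i-\bar{x})^{2}}.
\]
The title and keywords indicate that the paper concerns weak and strong consistency of this estimator when the errors may have infinite variance, under conditions phrased in terms of the domain of attraction of the normal law, the signal-to-noise ratio, and the reliability ratio.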
Cites Work
- Some limit behaviors for the LS estimator in simple linear EV regression models
- Functional asymptotic confidence intervals for the slope in linear error-in-variables models
- Self-normalized laws of the iterated logarithm
- When is the Student \(t\)-statistic asymptotically standard normal?
- Donsker's theorem for self-normalized partial sums processes
- Consistency of LS estimator in simple linear EV regression models
- Consistency for least squares regression estimators with infinite variance data
- On the generalized domain of attraction of the multivariate normal law and asymptotic normality of the multivariate Student \(t\)-statistic
- Central limit theorems in linear structural error-in-variables models with explanatory variables in the domain of attraction of the normal law
- New multivariate central limit theorems in linear structural and functional error-in-variables models
- Probability for Statisticians
- Some Convergence Theorems for Independent Random Variables
- An Inequality and Almost Sure Convergence
- Identifiability of a Linear Relation between Variables Which Are Subject to Error