A comparison of biased regression estimators using a Pitman nearness criterion
DOI: 10.1080/00949659008811208 · zbMath: 0726.62117 · OpenAlex: W2136220334 · MaRDI QID: Q3350544
J. Michael Hardin, Michael D. Conerly
Publication date: 1990
Published in: Journal of Statistical Computation and Simulation
Full work available at URL: https://doi.org/10.1080/00949659008811208
Keywords: James-Stein estimator; ridge regression; Pitman measure of closeness; principal components regression; generalized principal components estimator; biased regression estimators
MSC: Factor analysis and principal components; correspondence analysis (62H25) · Ridge regression; shrinkage estimators (Lasso) (62J07) · Linear regression; mixed models (62J05)
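The Pitman nearness (closeness) criterion named in the title compares two estimators by the probability that one lands closer to the true parameter than the other, rather than by mean squared error. As a purely illustrative sketch (not the authors' code; the design matrix, ridge constant `k`, and sample sizes below are arbitrary assumptions), a Monte Carlo comparison of ridge regression against OLS under this criterion might look like:

```python
# Hypothetical sketch: Monte Carlo estimate of the Pitman closeness of a
# ridge estimator relative to OLS. Estimator b1 is Pitman-closer than b2
# if P(||b1 - beta|| < ||b2 - beta||) > 1/2.
import numpy as np

rng = np.random.default_rng(0)
n, p, k, reps = 30, 4, 1.0, 2000   # k is an assumed, fixed ridge constant

X = rng.normal(size=(n, p))        # fixed design across replications
beta = np.ones(p)                  # assumed true coefficient vector
ridge_wins = 0
for _ in range(reps):
    y = X @ beta + rng.normal(size=n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    # count the replications in which ridge is closer to the truth
    if np.linalg.norm(b_ridge - beta) < np.linalg.norm(b_ols - beta):
        ridge_wins += 1

closeness = ridge_wins / reps
print(f"Estimated P(ridge closer than OLS): {closeness:.3f}")
```

A value above 0.5 would indicate that, in this particular simulated setting, ridge is Pitman-closer than OLS; the paper's actual simulation design and estimators (including James-Stein and generalized principal components variants) differ.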
Related Items (1)
Cites Work
- Good and optimal ridge estimators
- A Comparison of James-Stein Regression with Least Squares in the Pitman Nearness Sense
- Selecting the optimum k in ridge regression
- Algorithm AS 127: Generation of Random Orthogonal Matrices
- A Simulation Study of Some Ridge Estimators
- Ridge Regression and James-Stein Estimation: Review and Comments
- The Minimum Mean Square Error Linear Estimator and Ridge Regression
- Ridge Regression: Some Simulations
- Data Analysis Using Stein's Estimator and its Generalizations
- Ridge Analysis Following a Preliminary Test of the Shrunken Hypothesis
- A Class of Biased Estimators in Linear Regression
- A Simulation Study of Alternatives to Ordinary Least Squares
- Biased Estimation in Regression: An Evaluation Using Mean Squared Error
- The Pitman Nearness Criterion and Its Determination
- Ridge Regression: Applications to Nonorthogonal Problems
- On Inverse Estimation in Linear Regression
- Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation
- Some Probability Paradoxes in Choice from Among Random Alternatives