A new class of generalized Bayes minimax ridge regression estimators

From MaRDI portal
Publication:2583418

DOI: 10.1214/009053605000000327
zbMATH Open: 1078.62006
arXiv: math/0508282
OpenAlex: W2046802579
MaRDI QID: Q2583418

Yuzo Maruyama, William E. Strawderman

Publication date: 16 January 2006

Published in: The Annals of Statistics

Abstract: Let $y = A\beta + \epsilon$, where $y$ is an $N \times 1$ vector of observations, $\beta$ is a $p \times 1$ vector of unknown regression coefficients, $A$ is an $N \times p$ design matrix and $\epsilon$ is a spherically symmetric error term with unknown scale parameter $\sigma$. We consider estimation of $\beta$ under general quadratic loss functions and, in particular, extend the work of Strawderman [J. Amer. Statist. Assoc. 73 (1978) 623-627] and Casella [Ann. Statist. 8 (1980) 1036-1056; J. Amer. Statist. Assoc. 80 (1985) 753-758] by finding adaptive minimax estimators (which are, under the normality assumption, also generalized Bayes) of $\beta$ that have greater numerical stability (i.e., smaller condition number) than the usual least squares estimator. In particular, we give a subclass of such estimators which, surprisingly, has a very simple form. We also show that under certain conditions the generalized Bayes minimax estimators in the normal case are also generalized Bayes and minimax in the general case of spherically symmetric errors.
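
For context, the following sketch (not taken from the paper) recalls the classical fixed-constant ridge estimator that the title alludes to and why this kind of shrinkage improves conditioning; the notation $k$ and $d_1, \dots, d_p$ is introduced here for illustration only, and the paper's estimators are adaptive and generalized Bayes rather than of this fixed-$k$ form. The ordinary least squares and ridge estimators are
\[
\hat\beta_{\mathrm{LS}} = (A^\top A)^{-1} A^\top y,
\qquad
\hat\beta_k = (A^\top A + k I_p)^{-1} A^\top y, \quad k > 0 .
\]
If $d_1 \ge \cdots \ge d_p > 0$ are the eigenvalues of $A^\top A$, the condition numbers of the matrices being inverted satisfy
\[
\kappa(A^\top A) = \frac{d_1}{d_p},
\qquad
\kappa(A^\top A + k I_p) = \frac{d_1 + k}{d_p + k} < \frac{d_1}{d_p}
\quad \text{whenever } d_1 > d_p ,
\]
so any amount of ridge-type shrinkage strictly reduces the condition number relative to least squares. The abstract's claim concerns estimators that achieve such improved stability while remaining minimax and, under normality, generalized Bayes.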


Full work available at URL: https://arxiv.org/abs/math/0508282




Cited In (34)





This page was built for publication: A new class of generalized Bayes minimax ridge regression estimators
