Entropy loss and risk of improved estimators for the generalized variance and precision
From MaRDI portal
Publication: 1118291
DOI: 10.1007/BF00052348 · zbMath: 0668.62033 · OpenAlex: W2037105251 · MaRDI QID: Q1118291
Yoshihiko Konno, Nariaki Sugiura
Publication date: 1988
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/bf00052348
derivative; risk; entropy loss; zonal polynomials; best affine equivariant estimator; numerical comparison; Stein's truncated estimator; incomplete beta functions of matrix arguments; mixture representation of noncentral Wishart and multivariate beta distributions; multivariate linear hypotheses; reduction of risk
Related Items (6)
Wishart exponential families on cones related to tridiagonal matrices ⋮ Estimating the covariance matrix and the generalized variance under a symmetric loss ⋮ Equivariant estimation under the Pitman closeness criterion ⋮ Improved Estimation of Generalized Variance and Precision ⋮ Shrinkage and modification techniques in estimation of variance and the related problems: A review ⋮ Optimal critical values of pre-tests when estimating the regression error variance: Analytical findings under a general loss structure
Cites Work
- Asymptotic risk comparison of improved estimators for normal covariance matrix
- Trimmed minimax estimator of a covariance matrix
- On improved estimators of the generalized variance
- An improved estimator of the generalized variance
- Inadmissibility of the usual estimator for the variance of a normal distribution with unknown mean