An explicit formula for the risk of James-Stein estimators
From MaRDI portal
Publication:3969726
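The publication gives an explicit formula for the risk of James-Stein estimators. As a hedged illustration (not taken from the paper itself), the following sketch uses the standard setting X ~ N(theta, I_p) with p >= 3, where the James-Stein estimator is delta(x) = (1 - (p - 2)/||x||^2) x and, by Stein's identity, its quadratic risk equals p - (p - 2)^2 E[1/||X||^2]; a Monte Carlo check compares the directly simulated risk against that expression. The dimension, true mean, and sample size below are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustration under standard assumptions (not code from the publication):
# for X ~ N(theta, I_p) with p >= 3, the James-Stein estimator
#   delta_JS(x) = (1 - (p - 2) / ||x||^2) * x
# has risk E||delta_JS - theta||^2 = p - (p - 2)^2 * E[1 / ||X||^2],
# which is strictly below the maximum-likelihood estimator's constant risk p.

rng = np.random.default_rng(0)
p, n = 10, 200_000
theta = np.full(p, 0.5)  # arbitrary true mean vector

x = rng.normal(theta, 1.0, size=(n, p))      # n draws of X ~ N(theta, I_p)
norm2 = np.sum(x**2, axis=1)                 # ||x||^2 for each draw
js = (1.0 - (p - 2) / norm2)[:, None] * x    # James-Stein estimate per draw

# Risk estimated two ways: directly, and via the explicit formula.
mc_risk = np.mean(np.sum((js - theta)**2, axis=1))
formula_risk = p - (p - 2)**2 * np.mean(1.0 / norm2)

print(mc_risk, formula_risk)  # both agree and lie well below p = 10
```

The two estimates target the same quantity, so they should agree up to Monte Carlo error; both fall well below p, illustrating the dominance of the James-Stein estimator over the sample mean in dimension three and above.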
Cites work
- scientific article; zbMATH DE number 3122730 (no title available)
- Estimating the Mean of a Multivariate Normal Population with General Quadratic Loss Function
- Estimation with quadratic loss
- Families of minimax estimators of the mean of a multivariate normal distribution
- Minimax estimation of location parameters for spherically symmetric distributions with concave loss
- Stein's positive part estimator and Bayes estimator
Cited in (14)
- A simple form for the inverse moments of non-central \(\chi^2\) and \(F\) random variables and certain confluent hypergeometric functions
- A class of multiple shrinkage estimators
- Recurrence relations for noncentral density, distribution functions and inverse moments
- On the non-stochastic ordering of some quadratic forms
- The relationship between moments of truncated and original distributions plus some other simple structural properties of weighted distributions
- On sharper bounds for the risk of James-Stein estimators
- An Empirical Bayes Stein-Type Estimator for Regression Parameters Under Linear Constraints
- On the inevitability of a paradox in shrinkage estimation for scale mixtures of normals
- An exact formula for the mean squared error of the inverse estimator in the linear calibration problem
- Simultaneous estimation of the multivariate normal mean under balanced loss function
- James-Stein estimation with constraints on the norm
- On moments of beta mixtures, the noncentral beta distribution, and the coefficient of determination
- Effect of Sample Size on the Size of the Coefficient of Determination in Simple Linear Regression
- An Explicit Formula for the Risk of the Positive-Part James-Stein Estimator