Hierarchical empirical Bayes estimation of two sample means under divergence loss (Q2316969)

From MaRDI portal
Property / cites work: Differential geometry of curved exponential families. Curvatures and information loss / rank: Normal rank
Property / cites work: Q3687500 / rank: Normal rank
Property / cites work: Families of minimax estimators of the mean of a multivariate normal distribution / rank: Normal rank
Property / cites work: Improved minimax predictive densities under Kullback-Leibler loss / rank: Normal rank
Property / cites work: Empirical and hierarchical Bayes competitors of preliminary test estimators in two sample problems / rank: Normal rank
Property / cites work: Estimation, prediction and the Stein phenomenon under divergence loss / rank: Normal rank
Property / cites work: On the Stein phenomenon under divergence loss and an unknown variance-covariance matrix / rank: Normal rank
Property / cites work: A shrinkage predictive distribution for multivariate Normal observables / rank: Normal rank
Property / cites work: Q3292859 / rank: Normal rank
Property / cites work: Estimation of the mean of a multivariate normal distribution / rank: Normal rank


scientific article
Language: English
Label: Hierarchical empirical Bayes estimation of two sample means under divergence loss
Description: scientific article

    Statements

    Hierarchical empirical Bayes estimation of two sample means under divergence loss (English)
    7 August 2019
    The paper deals with the simultaneous estimation of two mean vectors when one suspects that the two are equal. The results are obtained under a general divergence loss, which measures the distance between two densities rather than the distance between two parameters and is therefore more intrinsic in nature. The paper proposes hierarchical empirical Bayes estimators which, in contrast to preliminary test estimators, can dominate the estimators of the individual sample means.
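    A minimal sketch of the kind of loss involved (added here as an illustration, following the form used in the cited work "Estimation, prediction and the Stein phenomenon under divergence loss"; the density f, action a and constant \beta are notation assumed for this sketch): for data with density f(x \mid \theta), an estimate a and \beta \in (0, 1),

        L_\beta(\theta, a) = \frac{1 - \int f^{1-\beta}(x \mid \theta)\, f^{\beta}(x \mid a)\, dx}{\beta(1 - \beta)},

    so the loss compares the densities indexed by \theta and a rather than the parameters themselves; the limits \beta \to 0, 1 recover the Kullback-Leibler loss, and \beta = 1/2 gives a multiple of the squared Hellinger distance.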
    dominance property
    Hellinger divergence
    Kullback-Leibler divergence
    minimaxity
    risk function
    shrinkage estimator
    simultaneous estimation
    Stein phenomenon

    Identifiers