Estimation, prediction and the Stein phenomenon under divergence loss
DOI: 10.1016/j.jmva.2008.02.002 · zbMATH Open: 1274.62080 · OpenAlex: W1963870319 · MaRDI QID: Q953855
Authors: Malay Ghosh, V. Mergel', Gauri Sankar Datta
Publication date: 6 November 2008
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2008.02.002
Recommendations
- On the Stein phenomenon under divergence loss and an unknown variance-covariance matrix
- On the Stein effect under density power divergence loss
- Stein estimation -- a review
- Density prediction and the Stein phenomenon
- Stein-rule estimation under an extended balanced loss function
- Estimation of a covariance matrix under Stein's loss
- On predictive density estimation under \(\alpha\)-divergence loss
- Stein estimation for infinitely divisible laws
Keywords: empirical Bayes; admissibility; minimaxity; Kullback-Leibler loss; Baranchik class; Bhattacharyya-Hellinger loss
MSC classifications:
- Empirical decision procedures; empirical Bayes procedures (62C12)
- Minimax procedures in statistical decision theory (62C20)
- Admissibility in statistical decision theory (62C15)
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Estimation with quadratic loss
- Title not available
- Title not available
- Improved minimax predictive densities under Kullback-Leibler loss
- A shrinkage predictive distribution for multivariate normal observables
- On Minimax Statistical Decision Procedures and their Admissibility
- Differential geometry of curved exponential families. Curvatures and information loss
- Goodness of prediction fit
- Stein's Estimation Rule and Its Competitors--An Empirical Bayes Approach
- On the Admissibility of Invariant Estimators of One or More Location Parameters
- Proper Bayes Minimax Estimators of the Multivariate Normal Mean
- Inadmissibility of maximum likelihood estimators in some multiple regression problems with three or more independent variables
- A Generalized Bayes Rule for Prediction
- A statistical diptych: Admissible inferences -- recurrence of symmetric Markov chains
- A Family of Minimax Estimators of the Mean of a Multivariate Normal Distribution
- The Admissibility of Pitman's Estimator of a Single Location Parameter
- Admissible Estimators, Recurrent Diffusions, and Insoluble Boundary Value Problems
- Some Problems in Minimax Point Estimation
- Deriving posterior distributions for a location parameter: A decision theoretic approach
- Minimax estimation of location parameters for spherically symmetric distributions with concave loss
- Minimax Bayes estimators of a multivariate normal mean
- Admissibility of procedures in two-dimensional location parameter problems
Cited In (25)
- Reference priors via \(\alpha \)-divergence for a certain non-regular model in the presence of a nuisance parameter
- On the Stein phenomenon under divergence loss and an unknown variance-covariance matrix
- On the Loss Robustness of Least-Square Estimators
- Admissible predictive density estimation
- Stein-rule estimation under an extended balanced loss function
- Stochastic domination in predictive density estimation for ordered normal means under \(\alpha\)-divergence loss
- Title not available
- On the Stein effect under density power divergence loss
- Density prediction and the Stein phenomenon
- On predictive density estimation under \(\alpha\)-divergence loss
- Exact minimax estimation of the predictive density in sparse Gaussian models
- From minimax shrinkage estimation to minimax shrinkage prediction
- On discrete priors and sparse minimax optimal predictive densities
- Matrix variate density estimation with additional information
- Some Variants of Constrained Estimation in Finite Population Sampling
- The informational gain from Stein and hierarchical Stein estimators
- A Generalized Bayes Rule for Prediction
- Optimal shrinkage estimation of predictive densities under \(\alpha\)-divergences
- Pitman closeness domination in predictive density estimation for two-ordered normal means under \(\alpha \)-divergence loss
- On generalized moment identity and its applications: a unified approach
- On minimax optimality of sparse Bayes predictive density estimates
- Bayesian prediction of a density function in terms of \(e\)-mixture
- On predictive density estimation with additional information
- Hierarchical empirical Bayes estimation of two sample means under divergence loss
- Minimax estimation of the mean matrix of the matrix variate normal distribution under the divergence loss function