On general maximum likelihood empirical Bayes estimation of heteroscedastic IID normal means (Q2188477)
Language | Label | Description | Also known as
---|---|---|---
English | On general maximum likelihood empirical Bayes estimation of heteroscedastic IID normal means | scientific article |
Statements
On general maximum likelihood empirical Bayes estimation of heteroscedastic IID normal means (English)

11 June 2020
The author considers empirical Bayes estimation for heteroscedastic data: \[ X_{i} \mid \left(\theta_{i},\sigma_{i}^{2} \right) \stackrel{\mathrm{ind}}{\sim} N\left(\theta_{i},\sigma_{i}^{2} \right), \quad i=1,\dots,n, \] where the variances \(\sigma_{i}^{2}\) are known. He uses a general maximum likelihood empirical Bayes (GMLEB) method to estimate \(\boldsymbol{\theta} = \left(\theta_{1},\dots,\theta_{n} \right)\) under the average squared loss function. From the author's abstract: The idea is to plug the generalized maximum likelihood estimator into the oracle Bayes rule. From the point of view of restricted empirical Bayes, the general empirical Bayes aims at a benchmark risk smaller than that of linear empirical Bayes methods when the unknown means are i.i.d. variables. We prove an oracle inequality stating that, under mild conditions, the regret of the GMLEB is of smaller order than \((\log n)^{5}/n\). The proof is based on a large deviation inequality for the generalized maximum likelihood estimator. The oracle inequality leads to the property that the GMLEB is adaptive minimax in \(L_{p}\) balls when the order of the norm of the ball is larger than \(\left( (\log n)^{5/2}/\sqrt{n}\right)^{1/(p\wedge 2)}\). We demonstrate the superb risk performance of the GMLEB through simulation experiments.
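The plug-in idea described above can be sketched numerically: estimate the unknown prior of the means by a nonparametric maximum likelihood fit over a fixed grid (via EM fixed-point updates), then apply the resulting Bayes posterior-mean rule to each observation. The following is a minimal illustration under these assumptions, not the paper's implementation; the function name `gmleb`, the grid discretization, and the iteration count are choices made here for the sketch.

```python
import numpy as np

def gmleb(x, sigma, grid_size=200, n_iter=200):
    """Sketch of GMLEB: nonparametric MLE of the prior on a fixed grid
    (EM fixed-point updates), followed by plug-in posterior-mean estimates."""
    x, sigma = np.asarray(x, float), np.asarray(sigma, float)
    grid = np.linspace(x.min(), x.max(), grid_size)   # support points for the prior
    # f[i, j] is proportional to N(x_i; grid_j, sigma_i^2);
    # row-constant factors cancel in the EM ratio and the posterior mean
    f = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / sigma[:, None]) ** 2) / sigma[:, None]
    w = np.full(grid_size, 1.0 / grid_size)           # uniform initial prior weights
    for _ in range(n_iter):
        marg = f @ w                                   # marginal density at each x_i
        w *= (f / marg[:, None]).mean(axis=0)          # EM update of the mixture weights
        w /= w.sum()
    post = f * w                                       # unnormalized posterior over the grid
    return (post @ grid) / post.sum(axis=1)            # plug-in Bayes rule (posterior means)
```

In a quick simulation with i.i.d. means and varying known variances, this estimator typically improves substantially on the naive estimate \(X_i\) itself, in line with the risk behavior the abstract describes.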
empirical Bayes
heteroscedasticity
generalized maximum likelihood estimator (MLE)
adaptive minimaxity