On estimating a density using Hellinger distance and some other strange facts (Q2266536)
From MaRDI portal
Latest revision as of 16:18, 14 June 2024
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | On estimating a density using Hellinger distance and some other strange facts | scientific article |
Statements
On estimating a density using Hellinger distance and some other strange facts (English)
1986
We shall investigate here the rates of convergence of estimators for various classes of densities with compact support in \(\mathbb{R}^n\), when the loss is measured by some distance between the density and its estimate, mainly \(\mathbb{L}^1\) or Hellinger. We shall develop the connection between the rate of convergence and the metric entropy of the class, using classical results from approximation theory, and also explain the differences that arise between \(\mathbb{L}^1\) and Hellinger. We shall then give methods for computing lower bounds. If the compactness restriction on the support is not satisfied, the dimension of the class becomes infinite, and we shall use the preceding methods to show that in this case no uniform rate of convergence exists: even if one can find a sequence of consistent estimators, its rate of convergence will be arbitrarily slow for some densities.
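As a quick numerical illustration (not part of the article), the two losses named in the abstract can be compared on a toy pair of densities. The densities below (uniform vs. triangular on \([0,1]\)) and the grid size are arbitrary choices; the final inequality is the standard comparison between the Hellinger and \(\mathbb{L}^1\) distances.

```python
import numpy as np

# Toy densities on [0, 1]: f uniform, g triangular (g(x) = 2x).
# These are illustrative choices, not examples from the paper.
n = 200_000
x = (np.arange(n) + 0.5) / n          # midpoint grid on [0, 1]
dx = 1.0 / n

f = np.ones(n)                        # f(x) = 1
g = 2.0 * x                           # g(x) = 2x

# L1 distance: integral of |f - g|   (exact value here: 1/2)
l1 = np.sum(np.abs(f - g)) * dx

# Squared Hellinger distance: 1 - integral of sqrt(f * g)
# (exact value here: 1 - 2*sqrt(2)/3)
hell_sq = 1.0 - np.sum(np.sqrt(f * g)) * dx
hell = np.sqrt(hell_sq)

# Standard comparison: H^2 <= L1/2 <= H * sqrt(2 - H^2),
# which is why rates can differ between the two losses.
assert hell_sq <= l1 / 2 <= hell * np.sqrt(2.0 - hell_sq)
```

The inequality shows the two distances are equivalent up to a square root, so rates of convergence measured in \(\mathbb{L}^1\) and in Hellinger distance can genuinely differ, as the abstract indicates.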
density estimation
minimax risk
Hellinger distance
rates of convergence of estimators
densities with compact support
loss function
metric entropy
methods for computing lower bounds