Density-free convergence properties of various estimators of entropy (Q1095528)
Language | Label | Description | Also known as
---|---|---|---
English | Density-free convergence properties of various estimators of entropy | scientific article |
Statements
Density-free convergence properties of various estimators of entropy (English)
1987
Let \(f(x)\) be a probability density function, \(x\in\mathbb R^d\). The Shannon (or differential) entropy is defined as \[ H(f)=-\int f(x)\log f(x)\,dx. \] In this paper we propose, based on a random sample \(X_1,\dots,X_n\) generated from \(f\), two new nonparametric estimators for \(H(f)\). Both entropy estimators are histogram-based in the sense that they involve a histogram-based density estimator \(\hat f_n\). We prove their almost sure consistency under the sole condition on \(f\) that \(H(f)\) be finite.
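The abstract does not give the estimators' exact construction, so the following is only a minimal sketch of a generic histogram-based plug-in entropy estimator in one dimension, assuming NumPy; the function name `histogram_entropy` and the Freedman-Diaconis binning rule are illustrative choices, not taken from the paper.

```python
import numpy as np

def histogram_entropy(x, bins="fd"):
    """Plug-in estimate of H(f) = -int f(x) log f(x) dx from a 1-D sample,
    built on a histogram density estimate (illustrative sketch only)."""
    counts, edges = np.histogram(x, bins=bins)  # bin counts and bin edges
    widths = np.diff(edges)                     # bin widths h_k
    p = counts / counts.sum()                   # empirical bin probabilities p_k
    nz = p > 0                                  # convention: 0 * log 0 = 0
    # The histogram density estimate is p_k / h_k on bin k, so the plug-in
    # entropy estimate is  -sum_k p_k * log(p_k / h_k).
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

# The standard normal has H(f) = 0.5 * log(2 * pi * e), about 1.4189;
# the estimate should approach this value as the sample size grows.
rng = np.random.default_rng(0)
print(histogram_entropy(rng.standard_normal(10_000)))
```

With a binning rule whose width shrinks as the sample grows, this kind of plug-in estimate is meant to illustrate the density-free flavour of the consistency result described above: no smoothness or tail assumptions on \(f\) beyond finiteness of \(H(f)\).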
Keywords: density-free convergence properties; differential entropy; almost sure convergence; \(L_1\)-convergence; Shannon entropy; entropy estimators; histogram-based; histogram-based density estimator; consistency