Moment information and entropy evaluation for probability densities (Q426654)
From MaRDI portal
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Moment information and entropy evaluation for probability densities | scientific article | |
Statements
Moment information and entropy evaluation for probability densities (English)
11 June 2012
The authors address the following problem: how can the entropy of an unknown probability density \(f\) of a random variable taking values in \([0,\,1]\) be computed from the knowledge of its moments? The difficulty lies in the fact that the \(L_2\) distance between the powers \(x^n\) decreases as the exponents increase, which makes the moment problem numerically ill-conditioned. The authors show that if \(f_N\) denotes the density reconstructed from the first \(N\) moments by the maximum entropy method, then the entropy of \(f_N\) converges to the entropy of \(f\) as \(N\) grows. This entropy-convergence result can be used both to choose an optimal number of moments for the determination of \(f_N\) and to estimate the entropy of \(f\) from its moments alone.
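The reconstruction step can be sketched numerically. The following is a minimal illustration, not the authors' code: it recovers a maximum entropy density \(f_N(x)=\exp(-\lambda_0-\sum_{j=1}^N \lambda_j x^j)\) on \([0,1]\) from prescribed moments by minimizing the standard convex dual functional \(\ln Z(\lambda)+\lambda\cdot\mu\), then compares entropies. The helper names (`maxent_density`, `entropy`) and the truncated-Gaussian target density are assumptions chosen for illustration; the target is itself of maximum entropy form for \(N=2\), so two moments should suffice to recover it.

```python
import numpy as np
from scipy.optimize import minimize

# Uniform quadrature grid on [0, 1].
xs = np.linspace(0.0, 1.0, 4001)
dx = xs[1] - xs[0]

def integrate(g):
    """Trapezoidal rule on the grid xs."""
    return float(np.sum((g[:-1] + g[1:]) * 0.5 * dx))

def maxent_density(moments):
    """Maximum entropy density exp(-lam_0 - sum_j lam_j x^j) on [0, 1]
    matching the moments mu_1, ..., mu_N (illustrative sketch)."""
    mu = np.asarray(moments, dtype=float)
    N = len(mu)
    powers = np.vstack([xs ** j for j in range(1, N + 1)])  # shape (N, grid)

    def dual(lam):
        # Convex dual ln Z(lam) + lam . mu; its minimizer matches the moments.
        return np.log(integrate(np.exp(-lam @ powers))) + lam @ mu

    lam = minimize(dual, np.zeros(N), method="BFGS").x
    unnorm = np.exp(-lam @ powers)
    return unnorm / integrate(unnorm)  # normalization absorbs lam_0

def entropy(f):
    """Differential entropy -int f ln f over [0, 1]."""
    return -integrate(f * np.log(np.clip(f, 1e-300, None)))

# Hypothetical target: a Gaussian truncated to [0, 1]; its log-density is
# quadratic, so it is exactly a maximum entropy density for N = 2.
target = np.exp(-(xs - 0.3) ** 2 / 0.08)
target /= integrate(target)
mu = [integrate(xs ** j * target) for j in (1, 2)]

f2 = maxent_density(mu)
H_true = entropy(target)
H_rec = entropy(f2)
print(H_true, H_rec)
```

With more moments and a target that is not of maximum entropy form, the same dual minimization applies, and monitoring the entropy of \(f_N\) as \(N\) increases gives the stopping rule discussed in the review.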
entropy convergence
Hausdorff moment problem
Kullback-Leibler distance
maximum entropy