Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances (Q1731757)

From MaRDI portal

Language: English
Label: Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
Description: scientific article

    Statements

    Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances (English)
    14 March 2019
    The main object of analysis of this paper is a generalization of the so-called Kozachenko-Leonenko (KL) estimator of the entropy \(H\) [\textit{L. F. Kozachenko} and \textit{N. N. Leonenko}, Probl. Inf. Transm. 23, No. 1--2, 95--101 (1987; Zbl 0633.62005); translation from Probl. Peredachi Inf. 23, No. 2, 9--16 (1987)]: \[ H_n=\frac{1}{n}\sum^n_{i=1} \log\left(\rho^d_{(k),i}\,\frac{V_d(n-1)}{e^{\Psi(k)}}\right), \] where \(\rho_{(k),i}\) denotes the distance from the \(i\)th observation to its \(k\)th nearest neighbour in the sample, \(V_d:=\pi^{d/2}/\Gamma(1+d/2)\) denotes the volume of the \(d\)-dimensional Euclidean unit ball, and \(\Psi\) denotes the digamma function. The generalization is formed as a weighted average of the KL estimator over different values of \(k\), with the weights chosen to cancel the dominant bias terms. A numerical sketch of \(H_n\) is given below.
    Kozachenko-Leonenko estimator
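
    The following is a minimal numerical sketch (not the authors' implementation) of the Kozachenko-Leonenko estimator \(H_n\) defined above, assuming NumPy and SciPy are available; the function name kl_entropy and its default arguments are illustrative only.

        # Minimal sketch of the Kozachenko-Leonenko entropy estimator H_n described
        # above; not the authors' code. Assumes NumPy and SciPy are available.
        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.special import digamma, gammaln

        def kl_entropy(X, k=1):
            """Estimate the entropy H of an (n x d) sample X from k-nearest-neighbour distances."""
            n, d = X.shape
            # rho_{(k),i}: distance from X_i to its k-th nearest neighbour
            # (query k+1 neighbours because the closest point to X_i is X_i itself).
            rho = cKDTree(X).query(X, k=k + 1)[0][:, k]
            # log V_d = log(pi^{d/2} / Gamma(1 + d/2)), the volume of the d-dimensional unit ball.
            log_Vd = 0.5 * d * np.log(np.pi) - gammaln(1 + 0.5 * d)
            # H_n = (1/n) * sum_i log( rho_{(k),i}^d * V_d * (n-1) / e^{Psi(k)} )
            return np.mean(d * np.log(rho) + log_Vd + np.log(n - 1) - digamma(k))

    As a rough check, for a standard bivariate normal sample kl_entropy(np.random.default_rng(0).standard_normal((10000, 2)), k=3) should be close to the true entropy \(\log(2\pi e)\approx 2.84\). The weighted generalization studied in the paper replaces a single such estimate by a linear combination \(\sum_j w_j H_n^{(j)}\) over several values of \(k\), with the weights summing to one and chosen to cancel the dominant bias terms; the specific weight construction is given in the paper and is not reproduced here.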
