\(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions (Q657553)

From MaRDI portal
Wikidata QID (P12): Q23921331
Cited works:
- Multivariate k-nearest neighbor density estimates
- Correction: A class of Rényi information estimators for multidimensional densities
- \(k_n\)-nearest neighbor estimators of entropy
- Best asymptotic normality of the kernel density entropy estimator for smooth densities
- Nearest neighbor estimates of entropy for multivariate circular distributions
- Estimation of multivariate Shannon entropy using moments
- An extension of the von Mises distribution
- The generalized von Mises distribution
- Divergence estimation for multidimensional densities via \(k\)-nearest-neighbor distances


Language: English
Label: \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions
Description: scientific article

    Statements

Title: \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions (English)
Publication date: 9 January 2012
    Summary: A consistent entropy estimator for hyperspherical data is proposed based on the \(k\)-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with its moment-based counterpart via simulations; the results show that the two methods are comparable.
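The knn entropy estimator summarized above can be illustrated with a generic Kozachenko-Leonenko-type sketch in Euclidean space. This is not the paper's estimator, which adapts the construction to hyperspherical geometry; the function name, the choice of \(k\), and the use of Euclidean ball volumes below are illustrative assumptions.

```python
import math
import random

def knn_entropy(points, k=3):
    """Kozachenko-Leonenko-type k-NN differential entropy estimator.

    Generic Euclidean sketch of the k-NN idea; the paper's version
    replaces Euclidean ball volumes with hyperspherical ones.
    """
    n = len(points)
    d = len(points[0])
    euler_gamma = 0.5772156649015329
    # digamma at a positive integer m: psi(m) = -gamma + sum_{j=1}^{m-1} 1/j
    def psi(m):
        return -euler_gamma + sum(1.0 / j for j in range(1, m))
    # log-volume of the unit ball in R^d
    log_vd = (d / 2.0) * math.log(math.pi) - math.lgamma(d / 2.0 + 1.0)
    total_log_dist = 0.0
    for i, p in enumerate(points):
        # distance from p to its k-th nearest neighbour among the other points
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        total_log_dist += math.log(dists[k - 1])
    return psi(n) - psi(k) + log_vd + (d / n) * total_log_dist
```

For a uniform sample on the unit square (true differential entropy 0), the estimate should be close to 0 for moderate sample sizes, up to boundary bias and sampling noise.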
    Keywords: hyperspherical distribution; directional data; differential entropy; cross entropy; Kullback-Leibler divergence; k-nearest neighbor
