\(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions (Q657553)
From MaRDI portal
Full work available at URL: https://doi.org/10.3390/e13030650
OpenAlex ID: W2051750307
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions | scientific article |
Statements
\(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions (English)
9 January 2012
Summary: A consistent entropy estimator for hyperspherical data is proposed based on the \(k\)-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross-entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies assess the performance of the estimators for models including the uniform and von Mises-Fisher distributions, and the proposed knn entropy estimator is compared with its moment-based counterpart. The results show that the two methods are comparable.
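As an illustration of the kNN approach the abstract refers to, the following is a minimal sketch of the classical Kozachenko-Leonenko kNN estimator of differential entropy in Euclidean space; the paper's contribution is an analogous estimator adapted to hyperspherical (directional) data, which this sketch does not reproduce. The function name and the choice `k=3` are illustrative, not from the source.

```python
import numpy as np
from math import gamma, log, pi

EULER_GAMMA = 0.5772156649015329

def digamma_int(m):
    # psi(m) for a positive integer m: -gamma + sum_{j=1}^{m-1} 1/j
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, m))

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko kNN estimate of differential entropy (nats).

    x: (n, d) array of i.i.d. samples. This is a Euclidean-space sketch;
    the paper develops the analogous estimator for data on the unit
    hypersphere.
    """
    n, d = x.shape
    # pairwise Euclidean distances
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)          # exclude each point itself
    eps = np.sort(dist, axis=1)[:, k - 1]   # distance to the k-th neighbor
    # log volume of the d-dimensional unit ball
    log_cd = (d / 2) * log(pi) - log(gamma(d / 2 + 1))
    return digamma_int(n) - digamma_int(k) + log_cd + d * np.mean(np.log(eps))
```

For a standard normal sample in one dimension, the estimate should approach the true entropy \(\tfrac{1}{2}\log(2\pi e) \approx 1.419\) nats as the sample size grows, consistent with the asymptotic unbiasedness the abstract states.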
hyperspherical distribution
directional data
differential entropy
cross entropy
Kullback-Leibler divergence
\(k\)-nearest neighbor