Correction: A class of Rényi information estimators for multidimensional densities
Publication: 620569
DOI: 10.1214/10-AOS773
zbMATH Open: 1205.94052
MaRDI QID: Q620569
FDO: Q620569
Authors: Luc Pronzato, Nikolai N. Leonenko
Publication date: 19 January 2011
Published in: The Annals of Statistics
Recommendations
- A class of Rényi information estimators for multidimensional densities
- Corrections to “Wyner’s Common Information Under Rényi Divergence Measures” [May 18 3616-3632]
- Information theoretic criteria in non-parametric density estimation. Bias and variance in the infinite dimensional case
- Statistical inference for the \(\epsilon \)-entropy and the quadratic Rényi entropy
- Rényi information, loglikelihood and an intrinsic distribution measure
- Correction to: Entropy and the consistent estimation of joint distributions
- Statistical inference for Rényi entropy functionals
- On the Riesz estimation of multivariate probability density functions
Keywords: entropy estimation; Tsallis entropy; estimation of divergence; estimation of statistical distance; nearest-neighbor distances; Rényi entropy; Havrda-Charvát entropy
Cites Work
Cited In (5)
- Entropy-based test for generalised Gaussian distributions
- Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence
- Title not available
- \(K\)-nearest neighbor based consistent entropy estimation for hyperspherical distributions
- On the Kozachenko-Leonenko entropy estimator
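The publication and several of the works citing it concern nearest-neighbor entropy estimators of the Kozachenko-Leonenko type. As background, a minimal sketch of the classic Kozachenko-Leonenko k-NN estimator of differential (Shannon) entropy — not the paper's Rényi/Tsallis generalization, and with names chosen here for illustration:

```python
import math
import numpy as np

def kl_entropy(x, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats.

    x : (n, d) array of i.i.d. samples from an unknown density f.
    Estimates H(f) = -E[log f(X)] as
        psi(n) - psi(k) + log V_d + (d/n) * sum_i log eps_i,
    where eps_i is the distance from x_i to its k-th nearest neighbor
    and V_d is the volume of the d-dimensional unit ball.
    """
    n, d = x.shape
    # Brute-force pairwise Euclidean distances (fine for moderate n).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)           # exclude self-distances
    eps = np.sort(dist, axis=1)[:, k - 1]    # k-th nearest-neighbor distance
    # log-volume of the d-dimensional unit ball.
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    # Digamma at a positive integer m: psi(m) = -gamma + sum_{j=1}^{m-1} 1/j.
    euler_gamma = 0.5772156649015329
    psi = lambda m: -euler_gamma + sum(1.0 / j for j in range(1, m))
    return psi(n) - psi(k) + log_vd + d * np.mean(np.log(eps))

# Illustrative check: for a standard Gaussian in d dimensions the true
# entropy is (d/2) * log(2*pi*e), so the estimate should land nearby.
rng = np.random.default_rng(0)
sample = rng.standard_normal((2000, 2))
estimate = kl_entropy(sample)
```

The estimator is consistent as n grows; bias-reduction and metric-learning refinements (as in the citing works listed above) address its finite-sample behavior.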