Universal estimation of information measures for analog sources
Publication:3589014
Cited in (5):
- Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
- Universal Divergence Estimation for Finite-Alphabet Sources
- The interplay between information and estimation measures
- Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence
- Optimal rates of entropy estimation over Lipschitz balls