
Universal estimation of information measures for analog sources

From MaRDI portal
Publication:3589014

DOI: 10.1561/0100000021
zbMATH Open: 1194.94174
OpenAlex: W2095324495
MaRDI QID: Q3589014
FDO: Q3589014

Sergio Verdú, Sanjeev R. Kulkarni, Qing Wang

Publication date: 10 September 2010

Published in: Foundations and Trends™ in Communications and Information Theory

Full work available at URL: https://doi.org/10.1561/0100000021




Mathematics Subject Classification ID

Measures of information, entropy (94A17)



Cited In (2)

  • Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
  • Optimal rates of entropy estimation over Lipschitz balls
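The publication's subject, and that of the citing works above, is nonparametric estimation of entropy and divergence from continuous (analog) samples. As an illustration of the kind of estimator involved, here is a minimal sketch of the classical Kozachenko–Leonenko k-nearest-neighbor differential entropy estimator; the function name and the brute-force O(n²) distance computation are illustrative choices, not taken from the publication itself:

```python
import math
import numpy as np

def knn_entropy(samples, k=1):
    """Kozachenko-Leonenko k-NN differential entropy estimate, in nats."""
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # Pairwise Euclidean distances (brute force; fine for a sketch).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)           # exclude each point itself
    eps = np.sort(dist, axis=1)[:, k - 1]    # distance to k-th nearest neighbor
    # Digamma at positive integer m: psi(m) = -gamma + H_{m-1}.
    psi = lambda m: -0.5772156649015329 + sum(1.0 / j for j in range(1, m))
    # Log-volume of the d-dimensional unit ball: pi^{d/2} / Gamma(d/2 + 1).
    log_c_d = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    # H_hat = psi(n) - psi(k) + log c_d + (d/n) * sum_i log eps_i
    return psi(n) - psi(k) + log_c_d + d * float(np.mean(np.log(eps)))

rng = np.random.default_rng(0)
h = knn_entropy(rng.normal(size=1000))  # true value: 0.5*log(2*pi*e) ~ 1.419 nats
```

The estimator is consistent for densities with mild regularity; the citing works above study precisely its bias reduction and the optimal convergence rates of such estimators.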


Recommendations
  • Universal Divergence Estimation for Finite-Alphabet Sources
  • The interplay between information and estimation measures
  • Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions







Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:3589014&oldid=16999967"
This page was last edited on 5 February 2024, at 03:14.