Relative entropy derivative bounds

Publication:280464

DOI: 10.3390/E15072861
zbMATH Open: 1398.94080
OpenAlex: W2012326118
MaRDI QID: Q280464
FDO: Q280464


Authors: Pablo Zegers, Alexis Fuentes, Carlos Alarcón


Publication date: 10 May 2016

Published in: Entropy

Full work available at URL: https://doi.org/10.3390/e15072861




Recommendations

  • A correspondence principle for relative entropy minimization
  • Some upper bounds for relative entropy and applications
  • Learning Theory
  • On density estimation under relative entropy loss criterion
  • Consistency and generalization bounds for maximum entropy density estimation


zbMATH Keywords

Kullback-Leibler divergence; relative entropy; Fisher information; asymptotic equipartition principle; maximum log likelihood; Shannon differential entropy; typical set


Mathematics Subject Classification ID

Measures of information, entropy (94A17)


Cites Work

  • A Mathematical Theory of Communication
  • Title not available
  • Analysis of signals in the Fisher-Shannon information plane


Uses Software

  • JPEG2000





This page was built for publication: Relative entropy derivative bounds


Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:280464&oldid=12163414"
This page was last edited on 30 January 2024, at 01:54.