From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation


DOI: 10.1214/009053606000000704
zbMATH Open: 1106.62005
arXiv: math/0702653
OpenAlex: W2086333522
MaRDI QID: Q869967
FDO: Q869967


Author: Tong Zhang


Publication date: 12 March 2007

Published in: The Annals of Statistics

Abstract: We consider an extension of \(\varepsilon\)-entropy to a KL-divergence based complexity measure for randomized density estimation methods. Based on this extension, we develop a general information-theoretic inequality that measures the statistical complexity of some deterministic and randomized density estimators. We then present consequences of the new inequality. In particular, we show that this technique leads to improvements of some classical results concerning the convergence of minimum description length and Bayesian posterior distributions. Moreover, we derive clean finite-sample convergence bounds that are not obtainable using previous approaches.
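As context for the abstract, here is a brief sketch of the central objects; the notation (model family \(p_\theta\), prior \(\pi\), temperature \(\lambda\)) is illustrative and may not match the paper's exact formulation. The KL divergence between densities \(p\) and \(q\) is

\[ D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx , \]

and a minimum information complexity (Gibbs-posterior style) estimator of the kind analyzed in this line of work selects a distribution \(\hat{w}\) over model parameters by trading empirical log-loss against KL distance to the prior:

\[ \hat{w} = \arg\min_{w} \left[ \mathbb{E}_{\theta \sim w} \, \frac{1}{n} \sum_{i=1}^{n} \bigl( -\log p_{\theta}(X_i) \bigr) + \frac{\lambda}{n} \, D_{\mathrm{KL}}(w \,\|\, \pi) \right] . \]

Taking \(\lambda = 1\) recovers the standard Bayesian posterior as the minimizer, which indicates how bounds of this form connect to the convergence of Bayesian posterior distributions and minimum description length mentioned in the abstract.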


Full work available at URL: https://arxiv.org/abs/math/0702653




Recommendations




Cites Work


Cited In (42)

Uses Software





This page was built for publication: From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
