Escort distributions minimizing the Kullback-Leibler divergence for a large deviations principle and tests of entropy level
DOI: 10.1007/s10463-014-0501-x · zbMath: 1440.62033 · OpenAlex: W2059046815 · MaRDI QID: Q263270
Valérie Girardin, Philippe Regnault
Publication date: 4 April 2016
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-014-0501-x
Keywords: estimation; large deviations principle; Kullback-Leibler divergence; information geometry; Shannon entropy; escort distributions; tests
MSC: Large deviations (60F10); Information theory (general) (94A15); Statistical aspects of information-theoretic topics (62B10)
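The record's keywords center on escort distributions and the Kullback-Leibler divergence. As a minimal illustration of these two notions, using their standard textbook definitions (the escort distribution of order q rescales probabilities as p_i^q / Σ_j p_j^q; this sketch is not the paper's specific minimization construction):

```python
import math

def escort(p, q):
    """Escort distribution of order q: p_i**q / sum_j p_j**q (standard definition)."""
    weights = [pi ** q for pi in p]
    z = sum(weights)
    return [w / z for w in weights]

def kl_divergence(p, r):
    """Kullback-Leibler divergence D(p || r) in nats, with the 0*log(0) = 0 convention."""
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

p = [0.7, 0.2, 0.1]
e = escort(p, 2.0)          # q > 1 sharpens the distribution toward its mode
print(e)
print(kl_divergence(p, e))  # non-negative; zero only when the two distributions coincide
```

For q = 1 the escort distribution reduces to p itself, so D(p ‖ escort(p, 1)) = 0; varying q traces out the one-parameter family studied in this literature.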
Cites Work
- A Mathematical Theory of Communication
- Tsallis distribution as a standard maximum entropy solution with `tail' constraint
- An informational divergence geometry for stochastic matrices
- I-divergence geometry of probability distributions and minimization problems
- Dynamical sources in information theory: Fundamental intervals and word prefixes
- Introduction to Nonextensive Statistical Mechanics
- Limit Distributions for a Statistical Estimate of the Entropy
- Asymptotic Statistics
- Large deviations for empirical entropies of g-measures
- On Information and Sufficiency
- Entropy-based goodness-of-fit tests—a unifying framework: Application to DNA replication
- Information Theory and Statistics: A Tutorial