Escort distributions minimizing the Kullback-Leibler divergence for a large deviations principle and tests of entropy level
From MaRDI portal
Recommendations
- Chi-square divergence and minimization problem
- scientific article; zbMATH DE number 3896069
- scientific article; zbMATH DE number 1104465
- Uses of entropy and divergence measures for evaluating econometric approximations and inference
- Minimization of the Kullback-Leibler divergence over a log-normal exponential arc
Cites work
- scientific article; zbMATH DE number 3903723
- scientific article; zbMATH DE number 107482
- scientific article; zbMATH DE number 3518103
- scientific article; zbMATH DE number 3568534
- scientific article; zbMATH DE number 681023
- scientific article; zbMATH DE number 1158743
- scientific article; zbMATH DE number 1560711
- scientific article; zbMATH DE number 819510
- scientific article; zbMATH DE number 3302017
- A Mathematical Theory of Communication
- An informational divergence geometry for stochastic matrices
- Asymptotic Statistics
- Dynamical sources in information theory: Fundamental intervals and word prefixes
- Entropy-based goodness-of-fit tests -- a unifying framework: application to DNA replication
- I-divergence geometry of probability distributions and minimization problems
- Information Theory and Statistics: A Tutorial
- Introduction to Nonextensive Statistical Mechanics
- Large deviations for empirical entropies of g-measures
- Limit Distributions for a Statistical Estimate of the Entropy
- On Information and Sufficiency
- Probability and introduction to statistics. Course and corrected exercises. Bachelor 3, engineering schools, CAPES and Agrégation in mathematics examinations
- Tsallis distribution as a standard maximum entropy solution with `tail' constraint
Cited in (7)
- Minimization of the Kullback-Leibler divergence over a log-normal exponential arc
- Decision theory and large deviations for dynamical hypotheses tests: the Neyman-Pearson lemma, min-max and Bayesian tests
- scientific article; zbMATH DE number 1966630
- Shannon's Entropy and Its Generalisations Towards Statistical Inference in Last Seven Decades
- Bayes posterior convergence for loss functions via almost additive thermodynamic formalism
- Different closed-form expressions for generalized entropy rates of Markov chains
- The conditional average entropies