On divergences of finite measures and their applicability in statistics and information theory
From MaRDI portal
Publication:5400840
DOI: 10.1080/02331880902986919 zbMath: 1282.62013 OpenAlex: W2165783665 MaRDI QID: Q5400840
Publication date: 12 March 2014
Published in: Statistics
Full work available at URL: https://doi.org/10.1080/02331880902986919
Keywords: differential power entropies; divergences of \(\sigma\)-finite measures; divergences of finite measures; local and global divergences of finite measures; Pinsker's inequality for finite measures; statistical censoring
Mathematics Subject Classification:
- Censored data models (62N01)
- Bayesian problems; characterization of Bayes procedures (62C10)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
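Among the listed keywords is Pinsker's inequality for finite measures. As background for that keyword, here is a minimal numerical sketch of the classical Pinsker inequality for discrete probability distributions, \(D(P \| Q) \ge 2\,\mathrm{TV}(P,Q)^2\) in nats (the paper's extension to general finite measures is not reproduced here; the distributions below are illustrative assumptions):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between the mass vectors."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Two example distributions on three points (illustrative values)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

kl = kl_divergence(p, q)
tv = total_variation(p, q)

# Classical Pinsker inequality: D(P || Q) >= 2 * TV(P, Q)^2
assert kl >= 2 * tv ** 2
```

Here \(D(P\|Q) \approx 0.0253\) and \(2\,\mathrm{TV}^2 = 0.02\), so the bound holds with some slack; the "Refinements of Pinsker's inequality" entry under Cites Work concerns sharpening exactly this kind of bound.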
Related Items
- On local divergences between two probability measures
- Continuity of f-projections and applications to the iterative proportional fitting procedure
- On testing local hypotheses via local divergence
- Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
- On divergence tests for composite hypotheses under composite likelihood
Cites Work
- Optimal statistical decisions about some alternative financial models
- Strong consistency of the MLE under random censoring
- A generalization of Ornstein's \(\overline d\) distance with applications to information theory
- On the geometry of metric measure spaces. I
- On the geometry of metric measure spaces. II
- The sequential probability ratio test under random censorship
- Speech coding based upon vector quantization
- On Divergences and Informations in Statistics and Information Theory
- Linear Prediction of Speech
- Rényi Statistics in Directed Families of Exponential Experiments*
- Refinements of Pinsker's inequality
- Measurement of Diversity
- Fair group decisions in investment planning
- Information in quantal response data and random censoring