A new directed divergence measure and its characterization
Publication: 3483213
DOI: 10.1080/03081079008935097
zbMath: 0703.94003
MaRDI QID: Q3483213
Publication date: 1990
Published in: International Journal of General Systems
Full work available at URL: https://doi.org/10.1080/03081079008935097
Keywords: entropy; finiteness; semiboundedness; probability distributions; divergence measure; variational distance; nonnegativity; distance measures; Shannon entropy function
94A17: Measures of information, entropy
Related Items
- A review of tests for exponentiality with Monte Carlo comparisons
- Lin–Wong divergence and relations on type I censored data
- Goodness of fit test using Lin-Wong divergence based on Type-I censored data
- Some properties of Lin–Wong divergence on the past lifetime data
- Convergence of Monte Carlo distribution estimates from rival samplers
- The Pólya information divergence
- Approximation of the integral mean divergence and \(f\)-divergence via mean results
- The limiting conditional probability distribution in a stochastic model of T cell repertoire maintenance
- Hermite-Hadamard trapezoid and mid-point divergences
- A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory
Cites Work
- Diversity and dissimilarity coefficients: A unified approach
- Cross entropy, dissimilarity measures, and characterizations of quadratic entropy
- Axiomatic characterization of the directed divergences and their linear combinations
- A Decision Theory Approach to the Approximation of Discrete Probability Densities
- Sharper lower bounds for discrimination information in terms of variation (Corresp.)
- Recursive density types and Nerode extensions of arithmetic
- On the best finite set of linear observables for discriminating two Gaussian signals
- Approximating discrete probability distributions with dependence trees
- Note on discrimination information and variation (Corresp.)
- The Reliability of Linear Feature Extractors
- On Information and Sufficiency