A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures
DOI: 10.2478/v10294-012-0002-6 · zbMath: 1277.62041 · OpenAlex: W1985122280 · MaRDI QID: Q2867274
K. C. Jain, Ram Naresh Saraswat
Publication date: 11 December 2013
Published in: Journal of Applied Mathematics, Statistics and Informatics
Full work available at URL: https://doi.org/10.2478/v10294-012-0002-6
Keywords: Kullback-Leibler distance; Jensen-Shannon divergence; chi-square divergence; \(J\)-divergence; \(f\)-divergence measure; Hellinger discrimination; relative \(J\)-divergence; triangular discrimination
MSC: Inequalities for sums, series and integrals (26D15); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
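For background, the keyword measures are all instances of the Csiszár \(f\)-divergence. A standard discrete formulation, with \(f\) convex on \((0,\infty)\) and \(f(1)=0\) (the generators listed below are the commonly used choices, given here as orientation rather than as the paper's own notation), is
\[
  I_f(P,Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right),
\]
where, for example, \(f(t) = t\log t\) yields the Kullback-Leibler distance, \(f(t) = (t-1)^2\) the chi-square divergence, \(f(t) = (\sqrt{t}-1)^2/2\) the Hellinger discrimination, and \(f(t) = (t-1)^2/(t+1)\) the triangular discrimination.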
Cites Work
- Bounds for \(f\)-divergences under likelihood ratio constraints
- Minimum Hellinger distance estimates for parametric models
- On the convexity of some divergence measures based on entropy functions
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects
- Information radius
- On Information and Sufficiency