Generalized Symmetric Divergence Measures and the Probability of Error
DOI: 10.1080/03610926.2011.594542
zbMath: 1355.94028
arXiv: 1103.5218
OpenAlex: W1922504709
MaRDI QID: Q4929217
Publication date: 13 June 2013
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://arxiv.org/abs/1103.5218
Keywords: probability of error; Jensen-Shannon divergence; \(f\)-divergence; J-divergence; arithmetic-geometric divergence
MSC classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
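For orientation, the symmetric divergences named in the keywords are usually defined as follows for discrete distributions \(P=(p_1,\dots,p_n)\) and \(Q=(q_1,\dots,q_n)\). This is a sketch of the standard textbook definitions, not an excerpt from the paper, which presumably treats a one-parameter family unifying these measures:
\[
J(P\|Q)=\sum_{i=1}^{n}(p_i-q_i)\ln\frac{p_i}{q_i},
\qquad
I(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\left[p_i\ln\frac{2p_i}{p_i+q_i}+q_i\ln\frac{2q_i}{p_i+q_i}\right],
\]
\[
T(P\|Q)=\sum_{i=1}^{n}\frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i q_i}},
\qquad
P_e=\frac{1}{2}\sum_{i=1}^{n}\min(p_i,q_i)\quad\text{(Bayes error, equal priors)},
\]
where \(J\) is the J-divergence (Jeffreys), \(I\) the Jensen-Shannon divergence, \(T\) the arithmetic-geometric mean divergence, and \(P_e\) the probability of error that such measures are typically used to bound.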
Cites Work
- Some aspects of error bounds in feature selection
- Relative information of type \(s\), Csiszár's \(f\)-divergence, and information inequalities
- On the f-divergence and singularity of probability measures
- On the convexity of some divergence measures based on entropy functions
- Probability of Error, Expected Divergence, and the Affinity of Several Distributions
- Information radius
- On Information and Sufficiency
- An invariant form for the prior probability in estimation problems