Information radius
Publication:5580827
DOI: 10.1007/BF00537520 · zbMath: 0186.53301 · OpenAlex: W4252822198 · MaRDI QID: Q5580827
Publication date: 1969
Published in: Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete
Full work available at URL: https://doi.org/10.1007/bf00537520
Related Items (38)
Unnamed Item
Generalized divergence measures and the probability of error
Unnamed Item
ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses
Divergence statistics based on entropy functions and stratified sampling
Strong converse exponent for classical-quantum channel coding
Numerical taxonomy and the principle of maximum entropy
The Augustin capacity and center
Can generalised divergences help for invariant neural networks?
On testing hypotheses with divergence statistics
Bounds on the probability of error in terms of generalized information radii
Informational divergence and the dissimilarity of probability distributions
\((R,S)\)-information radius of type \(t\) and comparison of experiments
A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation
Learning decision trees with taxonomy of propositionalized attributes
A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds
Mixed f-divergence and inequalities for log-concave functions
Generalized Symmetric Divergence Measures and the Probability of Error
Rényi generalizations of the conditional quantum mutual information
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
Strong converse for the classical capacity of entanglement-breaking and Hadamard channels via a sandwiched Rényi relative entropy
A class of measures of informativity of observation channels
Unnamed Item
Unnamed Item
Generalized ‘useful’ non-symmetric divergence measures and inequalities
Metric divergence measures and information value in credit scoring
Connections of generalized divergence measures with Fisher information matrix
A New Fuzzy Information Inequalities and its Applications in Establishing Relation among Fuzzy $f$-Divergence Measures
Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
Common Information, Noise Stability, and Their Extensions
A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures
Trigonometric entropies, Jensen difference divergence measures, and error bounds
Properties of noncommutative Rényi and Augustin information
The φ-Entropy in the Selection of a Fixed Number of Experiments
Generalized Jensen difference divergence measures and Fisher measure of information
A primer on alpha-information theory with application to leakage in secrecy systems
Cites Work
This page was built for publication: Information radius