Information radius

From MaRDI portal

DOI: 10.1007/BF00537520
zbMath: 0186.53301
OpenAlex: W4252822198
MaRDI QID: Q5580827

R. Sibson

Publication date: 1969

Published in: Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete

Full work available at URL: https://doi.org/10.1007/bf00537520




Related Items (38)

Unnamed Item
Generalized divergence measures and the probability of error
Unnamed Item
ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses
Divergence statistics based on entropy functions and stratified sampling
Strong converse exponent for classical-quantum channel coding
Numerical taxonomy and the principle of maximum entropy
The Augustin capacity and center
Can generalised divergences help for invariant neural networks?
On testing hypotheses with divergence statistics
Bounds on the probability of error in terms of generalized information radii
Informational divergence and the dissimilarity of probability distributions
\((R,S)\)-information radius of type \(t\) and comparison of experiments
A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation
Learning decision trees with taxonomy of propositionalized attributes
A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds
Mixed \(f\)-divergence and inequalities for log-concave functions
Generalized Symmetric Divergence Measures and the Probability of Error
Rényi generalizations of the conditional quantum mutual information
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
Strong converse for the classical capacity of entanglement-breaking and Hadamard channels via a sandwiched Rényi relative entropy
A class of measures of informativity of observation channels
Unnamed Item
Unnamed Item
Generalized ‘useful’ non-symmetric divergence measures and inequalities
Metric divergence measures and information value in credit scoring
Connections of generalized divergence measures with Fisher information matrix
A New Fuzzy Information Inequalities and its Applications in Establishing Relation among Fuzzy \(f\)-Divergence Measures
Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
Common Information, Noise Stability, and Their Extensions
A New Information Inequality and Its Application in Establishing Relation Among Various \(f\)-Divergence Measures
Trigonometric entropies, Jensen difference divergence measures, and error bounds
Properties of noncommutative Rényi and Augustin information
The φ‐Entropy in the Selection of a Fixed Number of Experiments
Generalized Jensen difference divergence measures and Fisher measure of information
A primer on alpha-information theory with application to leakage in secrecy systems





This page was built for publication: Information radius