The following pages link to Information radius (Q5580827):
Displayed 38 items.
- Strong converse for the classical capacity of entanglement-breaking and Hadamard channels via a sandwiched Rényi relative entropy (Q742865) (← links)
- Bounds on the probability of error in terms of generalized information radii (Q912072) (← links)
- Learning decision trees with taxonomy of propositionalized attributes (Q955830) (← links)
- Trigonometric entropies, Jensen difference divergence measures, and error bounds (Q1069901) (← links)
- Numerical taxonomy and the principle of maximum entropy (Q1126398) (← links)
- Informational divergence and the dissimilarity of probability distributions (Q1164917) (← links)
- \((R,S)\)-information radius of type \(t\) and comparison of experiments (Q1184942) (← links)
- Connections of generalized divergence measures with Fisher information matrix (Q1310926) (← links)
- Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems (Q1311716) (← links)
- Generalized divergence measures and the probability of error (Q1338635) (← links)
- Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses (Q1358818) (← links)
- Divergence statistics based on entropy functions and stratified sampling (Q1358834) (← links)
- A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation (Q1746556) (← links)
- Properties of noncommutative Rényi and Augustin information (Q2113490) (← links)
- A primer on alpha-information theory with application to leakage in secrecy systems (Q2117885) (← links)
- The Augustin capacity and center (Q2190923) (← links)
- Metric divergence measures and information value in credit scoring (Q2337058) (← links)
- Strong converse exponent for classical-quantum channel coding (Q2408555) (← links)
- A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds (Q2485436) (← links)
- A class of measures of informativity of observation channels (Q2556388) (← links)
- A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures (Q2867274) (← links)
- ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH (Q3728800) (← links)
- (Q3819824) (← links)
- (Q4039218) (← links)
- The φ-Entropy in the Selection of a Fixed Number of Experiments (Q4246438) (← links)
- Generalized Jensen difference divergence measures and Fisher measure of information (Q4246460) (← links)
- On testing hypotheses with divergence statistics (Q4337136) (← links)
- (Q4614355) (← links)
- Generalized Symmetric Divergence Measures and the Probability of Error (Q4929217) (← links)
- Mixed \(f\)-divergence and inequalities for log-concave functions (Q5175027) (← links)
- Rényi generalizations of the conditional quantum mutual information (Q5178224) (← links)
- (Q5210473) (← links)
- Generalized ‘useful’ non-symmetric divergence measures and inequalities (Q5244222) (← links)
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects (Q5441824) (← links)
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects (Q5756400) (← links)
- A New Fuzzy Information Inequalities and its Applications in Establishing Relation among Fuzzy \(f\)-Divergence Measures (Q5863532) (← links)
- Common Information, Noise Stability, and Their Extensions (Q5863763) (← links)
- Can generalised divergences help for invariant neural networks? (Q6178829) (← links)