f-entropies, probability of error, and feature selection

From MaRDI portal
Publication: 4177461

DOI: 10.1016/S0019-9958(78)90587-9
zbMath: 0394.94011
MaRDI QID: Q4177461

Moshe Ben-Bassat

Publication date: 1978

Published in: Information and Control




Related Items

Information of degree \(\beta\) and probability of error
Ordering and selecting extreme populations by means of entropies and divergences
A generalization of HH \(f\)-divergence
ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
Inequalities of the Jensen and Edmundson-Lah-Ribarič type for 3-convex functions with applications
Reverses of the Jensen inequality in terms of first derivative and applications
On the Probability of Error in Fuzzy Discrimination Problems
General Lebesgue integral inequalities of Jensen and Ostrowski type for differentiable functions whose derivatives in absolute value are h-convex and applications
On Block-Iterative Entropy Maximization
Extremal Characteristics of Statistical Criteria with Given Total Variation Distances between Hypotheses
Bounds on the probability of error in terms of generalized information radii
Formulas for Rényi information and related measures for univariate distributions
Some bounds on probability of error in fuzzy discrimination problems
Inequalities of the Edmundson-Lah-Ribarič type for \(n\)-convex functions with applications
An \((R',S')\)-norm fuzzy relative information measure and its applications in strategic decision-making
SOME REVERSES OF THE JENSEN INEQUALITY WITH APPLICATIONS
Rényi information, loglikelihood and an intrinsic distribution measure
A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory
On Rényi information for ergodic diffusion processes
Optimization of Burg's entropy over linear constraints
Trigonometric entropies, Jensen difference divergence measures, and error bounds