On the convexity of some divergence measures based on entropy functions
Publication: 3937278
DOI: 10.1109/TIT.1982.1056497 · zbMath: 0479.94009 · OpenAlex: W2070134780 · MaRDI QID: Q3937278
Jacob Burbea, C. Radhakrishna Rao
Publication date: 1982
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.1982.1056497
Measures of information, entropy (94A17) ⋮ Convex sets in \(n\) dimensions (including convex hypersurfaces) (52A20)
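For context on the title: the divergence studied here is the Jensen difference built from an entropy function. A minimal sketch in LaTeX, using the standard Burbea-Rao definitions (the symbols \(H_\varphi\) and \(J_\varphi\) are assumed notation for this sketch, not taken from this record):

% phi-entropy and its Jensen difference (Burbea--Rao divergence);
% phi is concave, e.g. phi(x) = -x log x recovers Shannon entropy.
\[
  H_\varphi(p) = \sum_{i} \varphi(p_i), \qquad
  J_\varphi(p,q) = H_\varphi\!\left(\frac{p+q}{2}\right)
                 - \frac{H_\varphi(p) + H_\varphi(q)}{2}.
\]
% For phi(x) = -x log x, J_phi is the Jensen-Shannon divergence that
% recurs throughout the related items below; the paper asks for which
% phi this Jensen difference is convex in the pair (p, q).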
Related Items (88)
On choosing a goodness‐of‐fit test for discrete multivariate data ⋮ Evolution of the global inequality in greenhouse gases emissions using multidimensional generalized entropy measures ⋮ Asymptotic approximations for the distributions of the \(K_\phi\)-divergence goodness-of-fit statistics ⋮ Likelihood Divergence Statistics for Testing Hypotheses in Familial Data ⋮ RAO'S STATISTIC FOR THE ANALYSIS OF UNIFORM ASSOCIATION IN CROSS-CLASSIFICATIONS ⋮ Analysis of symbolic sequences using the Jensen-Shannon divergence ⋮ Ordering and selecting extreme populations by means of entropies and divergences ⋮ Statistical aspects of divergence measures ⋮ New bounds for Shannon, relative and Mandelbrot entropies via Hermite interpolating polynomial ⋮ Approximations of Jensen divergence for twice differentiable functions ⋮ Extension of some results for channel capacity using a generalized information measure ⋮ Bounds for \(f\)-divergences under likelihood ratio constraints. ⋮ Rényi information measure for a used item ⋮ ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH ⋮ On the parameter space derived from the joint probability density functions and the property of its scalar curvature ⋮ Divergence statistics based on entropy functions and stratified sampling ⋮ A family of generalized quantum entropies: definition and properties ⋮ Generalized divergences from generalized entropies ⋮ Reducing complexity in polygonal meshes with view-based saliency ⋮ A comparison of some estimators of the mixture proportion of mixed normal distributions ⋮ Entropy differential metric, distance and divergence measures in probability spaces: A unified approach ⋮ A brief biography and appreciation of Calyampudi Radhakrishna Rao, with a bibliography of his books and papers ⋮ Metrics induced by Jensen-Shannon and related divergences on positive definite matrices ⋮ New converses of Jensen inequality via Green functions with applications ⋮ Reverses of the Jensen inequality in terms of first derivative and applications ⋮ Jensen-Renyi's-Tsallis fuzzy divergence information measure with its applications ⋮ Skew Jensen-Bregman Voronoi Diagrams ⋮ Tsallis mutual information for document classification ⋮ General Lebesgue integral inequalities of Jensen and Ostrowski type for differentiable functions whose derivatives in absolute value are h-convex and applications ⋮ Shannon's Entropy and Its Generalisations Towards Statistical Inference in Last Seven Decades ⋮ Simple comparison of atomic population and shape atomic populations distributions between two molecular structures with a coherent number of atoms ⋮ Divergence-based tests for the bivariate gamma distribution applied to polarimetric synthetic aperture radar ⋮ Hermite-Hadamard type inequalities with applications ⋮ Extended fractional cumulative past and paired \(\phi\)-entropy measures ⋮ Quadratic entropy and analysis of diversity ⋮ Rényi statistics for testing hypotheses in mixed linear regression models ⋮ On testing hypotheses with divergence statistics ⋮ Bounds on the probability of error in terms of generalized information radii ⋮ Generalization of some bounds containing entropies on time scales ⋮ Nested models for categorical data ⋮ Formulas for Rényi information and related measures for univariate distributions.
⋮ Income distributions and decomposable divergence measures ⋮ New estimates for Csiszár divergence and Zipf-Mandelbrot entropy via Jensen-Mercer's inequality ⋮ Chernoff distance for truncated distributions ⋮ Certainty equivalents and information measures: Duality and extremal principles ⋮ Burbea-Rao divergence based statistics for testing uniform association ⋮ \((R,S)\)-information radius of type \(t\) and comparison of experiments ⋮ The Jensen-Shannon divergence ⋮ Gamma process-based models for disease progression ⋮ Learning decision trees with taxonomy of propositionalized attributes ⋮ SOME REVERSES OF THE JENSEN INEQUALITY WITH APPLICATIONS ⋮ On the \(J\)-divergence of intuitionistic fuzzy sets with its application to pattern recognition ⋮ A generalized model for the analysis of association in ordinal contingency tables ⋮ Expressions for Rényi and Shannon entropies for bivariate distributions ⋮ A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds ⋮ Minimum \(K_\phi\)-divergence estimator. ⋮ Minimum \(K_\phi\)-divergence estimators for multinomial models and applications ⋮ Expressions for Rényi and Shannon entropies for multivariate distributions ⋮ Generalized Symmetric Divergence Measures and the Probability of Error ⋮ Estimating Rao's statistic distribution for testing uniform association in cross-classifications ⋮ Generalized arithmetic and geometric mean divergence measure and their statistical aspects ⋮ The \(K_{\varphi}\)-divergence statistic for categorical data problems ⋮ Entropy measures associated with j-divergences ⋮ A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory ⋮ Chernoff distance for conditionally specified models ⋮ Spectral dimensionality reduction for Bregman information ⋮ Rao's statistic for constant and proportional hazard models ⋮ Converse to the Sherman inequality with applications ⋮ Testing stationary distributions of Markov chains based on Rao's divergence ⋮ \((h,\Psi)\)-entropy differential metric ⋮ ON PARAMETRIZED HERMITE-HADAMARD TYPE INEQUALITIES ⋮ Improving the accuracy of goodness-of-fit tests based on Rao's divergence with small sample size.
⋮ On Burbea-Rao divergence based goodness-of-fit tests for multinomial models ⋮ Goodness-of-fit tests for stationary distributions of Markov chains based on Rao's divergence ⋮ Hermite-Hadamard trapezoid and mid-point divergences ⋮ Metric divergence measures and information value in credit scoring ⋮ Hierarchical clustering based on the information bottleneck method using a control process ⋮ Divergence measures between populations: applications in the exponential family ⋮ Goodness-of-fit tests based on Rao's divergence under sparseness assumptions ⋮ Connections of generalized divergence measures with Fisher information matrix ⋮ A New Fuzzy Information Inequalities and its Applications in Establishing Relation among Fuzzy \(f\)-Divergence Measures ⋮ Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems ⋮ On preferred point geometry in statistics ⋮ A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures ⋮ Model checking in loglinear models using \(\phi\)-divergences and MLEs ⋮ Trigonometric entropies, Jensen difference divergence measures, and error bounds ⋮ \((h, \phi)\)-information measure as a criterion of comparison of experiments in a Bayesian context ⋮ Asymptotic normality for the \(K_{\phi}\)-divergence goodness-of-fit tests