On the convexity of some divergence measures based on entropy functions

Publication: 3937278

DOI: 10.1109/TIT.1982.1056497
zbMath: 0479.94009
OpenAlex: W2070134780
MaRDI QID: Q3937278

Jacob Burbea, C. Radhakrishna Rao

Publication date: 1982

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://doi.org/10.1109/tit.1982.1056497

Related Items (96)

On choosing a goodness‐of‐fit test for discrete multivariate data
Evolution of the global inequality in greenhouse gases emissions using multidimensional generalized entropy measures
Asymptotic approximations for the distributions of the \(K_\phi\)-divergence goodness-of-fit statistics
Unnamed Item
Likelihood Divergence Statistics for Testing Hypotheses in Familial Data
RAO'S STATISTIC FOR THE ANALYSIS OF UNIFORM ASSOCIATION IN CROSS-CLASSIFICATIONS
Analysis of symbolic sequences using the Jensen-Shannon divergence
Ordering and selecting extreme populations by means of entropies and divergences
Statistical aspects of divergence measures
New bounds for Shannon, relative and Mandelbrot entropies via Hermite interpolating polynomial
Approximations of Jensen divergence for twice differentiable functions
Extension of some results for channel capacity using a generalized information measure
Bounds for \(f\)-divergences under likelihood ratio constraints.
Rényi information measure for a used item
ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
On the parameter space derived from the joint probability density functions and the property of its scalar curvature
Divergence statistics based on entropy functions and stratified sampling
A family of generalized quantum entropies: definition and properties
Generalized divergences from generalized entropies
Reducing complexity in polygonal meshes with view-based saliency
A comparison of some estimators of the mixture proportion of mixed normal distributions
Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
A brief biography and appreciation of Calyampudi Radhakrishna Rao, with a bibliography of his books and papers
Metrics induced by Jensen-Shannon and related divergences on positive definite matrices
New converses of Jensen inequality via Green functions with applications
Reverses of the Jensen inequality in terms of first derivative and applications
Jensen-Renyi's-Tsallis fuzzy divergence information measure with its applications
Skew Jensen-Bregman Voronoi Diagrams
Tsallis mutual information for document classification
General Lebesgue integral inequalities of Jensen and Ostrowski type for differentiable functions whose derivatives in absolute value are h-convex and applications
Shannon's Entropy and Its Generalisations Towards Statistical Inference in Last Seven Decades
Simple comparison of atomic population and shape atomic populations distributions between two molecular structures with a coherent number of atoms
Divergence-based tests for the bivariate gamma distribution applied to polarimetric synthetic aperture radar
Hermite-Hadamard type inequalities with applications
Extended fractional cumulative past and paired \(\phi\)-entropy measures
Quadratic entropy and analysis of diversity
Rényi statistics for testing hypotheses in mixed linear regression models
On testing hypotheses with divergence statistics
Bounds on the probability of error in terms of generalized information radii
Generalization of some bounds containing entropies on time scales
Unnamed Item
Nested models for categorical data
Formulas for Rényi information and related measures for univariate distributions.
Income distributions and decomposable divergence measures
New estimates for Csiszár divergence and Zipf-Mandelbrot entropy via Jensen-Mercer's inequality
Chernoff distance for truncated distributions
Certainty equivalents and information measures: Duality and extremal principles
Burbea-Rao divergence based statistics for testing uniform association
\((R,S)\)-information radius of type \(t\) and comparison of experiments
The Jensen-Shannon divergence
Gamma process-based models for disease progression
Learning decision trees with taxonomy of propositionalized attributes
SOME REVERSES OF THE JENSEN INEQUALITY WITH APPLICATIONS
On the \(J\)-divergence of intuitionistic fuzzy sets with its application to pattern recognition
A generalized model for the analysis of association in ordinal contingency tables
Expressions for Rényi and Shannon entropies for bivariate distributions
A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds
Minimum \(K_\phi\)-divergence estimator.
Minimum \(K_\phi\)-divergence estimators for multinomial models and applications
Expressions for Rényi and Shannon entropies for multivariate distributions
Generalized Symmetric Divergence Measures and the Probability of Error
Estimating Rao's statistic distribution for testing uniform association in cross-classifications
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
The \(K_{\varphi}\)-divergence statistic for categorical data problems
Entropy measures associated with j-divergences
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory
Chernoff distance for conditionally specified models
Spectral dimensionality reduction for Bregman information
Rao's statistic for constant and proportional hazard models
Converse to the Sherman inequality with applications
Testing stationary distributions of Markov chains based on Rao's divergence
\((h,\Psi)\)-entropy differential metric
ON PARAMETRIZED HERMITE-HADAMARD TYPE INEQUALITIES
Improving the accuracy of goodness-of-fit tests based on Rao's divergence with small sample size.
Unnamed Item
Unnamed Item
Unnamed Item
Unnamed Item
Unnamed Item
On Burbea-Rao divergence based goodness-of-fit tests for multinomial models
Goodness-of-fit tests for stationary distributions of Markov chains based on Rao's divergence
Hermite-Hadamard trapezoid and mid-point divergences
Metric divergence measures and information value in credit scoring
Hierarchical clustering based on the information bottleneck method using a control process
Divergence measures between populations: applications in the exponential family
Goodness-of-fit tests based on Rao's divergence under sparseness assumptions
Connections of generalized divergence measures with Fisher information matrix
A New Fuzzy Information Inequalities and its Applications in Establishing Relation among Fuzzy $f$-Divergence Measures
Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
On preferred point geometry in statistics
A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures
Model checking in loglinear models using \(\phi\)-divergences and MLEs
Trigonometric entropies, Jensen difference divergence measures, and error bounds
(h, φ)‐information measure as a criterion of comparison of experiments in a Bayesian context
Asymptotic normality for the \(K_{\phi}\)-divergence goodness-of-fit tests



