Pages that link to "Item:Q3937278"
The following pages link to On the convexity of some divergence measures based on entropy functions (Q3937278):
Displaying 50 items.
- A family of generalized quantum entropies: definition and properties (Q331921)
- Reverses of the Jensen inequality in terms of first derivative and applications (Q384115)
- Tsallis mutual information for document classification (Q400940)
- Estimating Rao's statistic distribution for testing uniform association in cross-classifications (Q554524)
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach (Q594457)
- Income distributions and decomposable divergence measures (Q654518)
- Chernoff distance for truncated distributions (Q657075)
- The Jensen-Shannon divergence (Q677507)
- Ordering and selecting extreme populations by means of entropies and divergences (Q843140)
- Bounds for \(f\)-divergences under likelihood ratio constraints (Q851578)
- Bounds on the probability of error in terms of generalized information radii (Q912072)
- Learning decision trees with taxonomy of propositionalized attributes (Q955830)
- Trigonometric entropies, Jensen difference divergence measures, and error bounds (Q1069901)
- Statistical aspects of divergence measures (Q1096987)
- Extension of some results for channel capacity using a generalized information measure (Q1101061)
- Certainty equivalents and information measures: Duality and extremal principles (Q1177028)
- \((R,S)\)-information radius of type \(t\) and comparison of experiments (Q1184942)
- A generalized model for the analysis of association in ordinal contingency tables (Q1205459)
- \((h,\Psi)\)-entropy differential metric (Q1265614)
- Improving the accuracy of goodness-of-fit tests based on Rao's divergence with small sample size (Q1274827)
- On Burbea-Rao divergence based goodness-of-fit tests for multinomial models (Q1290938)
- Goodness-of-fit tests for stationary distributions of Markov chains based on Rao's divergence (Q1291566)
- Connections of generalized divergence measures with Fisher information matrix (Q1310926)
- Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems (Q1311716)
- Divergence statistics based on entropy functions and stratified sampling (Q1358834)
- A comparison of some estimators of the mixture proportion of mixed normal distributions (Q1372086)
- Formulas for Rényi information and related measures for univariate distributions (Q1425295)
- On preferred point geometry in statistics (Q1598689)
- Model checking in loglinear models using \(\phi\)-divergences and MLEs (Q1600735)
- Asymptotic normality for the \(K_\phi\)-divergence goodness-of-fit tests (Q1612405)
- Evolution of the global inequality in greenhouse gases emissions using multidimensional generalized entropy measures (Q1619025)
- New bounds for Shannon, relative and Mandelbrot entropies via Hermite interpolating polynomial (Q1639678)
- Hermite-Hadamard type inequalities with applications (Q1691461)
- Minimum \(K_\phi\)-divergence estimator (Q1764520)
- Expressions for Rényi and Shannon entropies for multivariate distributions (Q1767750)
- Chernoff distance for conditionally specified models (Q1785816)
- Testing stationary distributions of Markov chains based on Rao's divergence (Q1809012)
- Goodness-of-fit tests based on Rao's divergence under sparseness assumptions (Q1855740)
- Asymptotic approximations for the distributions of the \(K_\phi\)-divergence goodness-of-fit statistics (Q1880273)
- On the parameter space derived from the joint probability density functions and the property of its scalar curvature (Q1901401)
- A brief biography and appreciation of Calyampudi Radhakrishna Rao, with a bibliography of his books and papers (Q1914216)
- Simple comparison of atomic population and shape atomic populations distributions between two molecular structures with a coherent number of atoms (Q1936783)
- Nested models for categorical data (Q1961823)
- Hermite-Hadamard trapezoid and mid-point divergences (Q2080470)
- Generalized divergences from generalized entropies (Q2153434)
- New converses of Jensen inequality via Green functions with applications (Q2175893)
- New estimates for Csiszár divergence and Zipf-Mandelbrot entropy via Jensen-Mercer's inequality (Q2221991)
- Burbea-Rao divergence based statistics for testing uniform association (Q2229821)
- Gamma process-based models for disease progression (Q2241509)
- Minimum \(K_\phi\)-divergence estimators for multinomial models and applications (Q2259723)