Some bounds on entropy measures in information theory
From MaRDI portal
Publication:1372258
DOI: 10.1016/S0893-9659(97)00028-1 · zbMATH Open: 0906.94004 · MaRDI QID: Q1372258
Authors: C. J. Goh, Silvestru Sever Dragomir
Publication date: 23 February 1999
Published in: Applied Mathematics Letters
Mathematics Subject Classification: Measures of information, entropy (94A17); Inequalities involving derivatives and differential and integral operators (26D10); Convex functions and convex programs in convex geometry (52A41)
Cited In (52)
- Title not available
- New Bounds for the Jensen-Dragomir Functional with Applications in Analysis
- The asymptotic bound for quasi fuzzy entropy
- A note on the upper bound for the difference between two entropies
- Title not available
- Title not available
- Bounds for degree-based network entropies
- A refinement of the Jensen-Simic-Mercer inequality with applications to entropy
- A survey of reverse inequalities for \(f\)-divergence measure in information theory
- Entropy bounds for dendrimers
- Title not available
- A history of graph entropy measures
- Bounds on the entropy series
- A new entropy upper bound
- Extremality of degree-based graph entropies
- Information entropy of a discrete information source with \(g_\lambda\) distribution and its application
- Title not available
- Inequalities for entropy-based measures of network information content
- Best possible global bounds for Jensen's inequality
- Probabilistic inequalities for evaluating structural network measures
- Entropy lower bounds related to a problem of universal coding and prediction
- Some upper bounds for relative entropy and applications
- Entropy and expansion
- New entropy bounds via uniformly convex functions
- A von Neumann entropy condition of unitary equivalence of quantum states
- Jensen-Mercer inequality for uniformly convex functions with some applications
- Bounds for \(f\)-divergences under likelihood ratio constraints.
- Jensen's inequality and \(tgs\)-convex functions with applications
- Inequalities for discrete \(f\)-divergence measures: a survey of recent results
- Entropy of weighted graphs with Randić weights
- \(fgh\)-Convex Functions and Entropy Bounds
- Degree-based entropies of networks revisited
- A Note on Hoeffding's Inequality
- A new refinement of Jensen-type inequality with respect to uniformly convex functions with applications in information theory
- Bounds on entropy in a guessing game
- On mm-Entropy of a Banach Space with Gaussian Measure
- Refinements of some bounds in information theory
- Some properties of the relative entropy density of arbitrary information source
- First degree-based entropy of graphs
- Network entropies based on independent sets and matchings
- Bounds for Kullback-Leibler divergence
- Fredman–Komlós bounds and information theory
- Characterizations of von Neumann entropy and Tsallis \(p\)-entropy on quantum states
- Sharp global bounds for Jensen's inequality
- Title not available
- Best possible global bounds for Jensen functional
- An application of Ky Fan inequality: on Kullback-Leibler divergence between a probability distribution and its negation
- An extension of Jensen-Mercer inequality with applications to entropy
- Jensen's inequality and new entropy bounds
- On new refinement of the Jensen inequality using uniformly convex functions with applications
- On monotonicity and superadditivity properties of the entropy function
- A Probabilistic Upper Bound on Differential Entropy