Generalizations of Entropy and Information Measures
Publication: 2790447
DOI: 10.1007/978-3-319-18275-9_22
zbMath: 1368.94061
OpenAlex: W2246385381
MaRDI QID: Q2790447
Thomas L. Toulias, Christos P. Kitsos
Publication date: 4 March 2016
Published in: Computation, Cryptography, and Network Security
Full work available at URL: https://doi.org/10.1007/978-3-319-18275-9_22
Keywords: Shannon entropy; Rényi entropy; generalized normal distribution; Fisher's entropy type information measure; SDL complexity
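For orientation, the keywords refer to standard information-theoretic quantities; the formulas below are the usual continuous-case definitions for a random vector \(X\) with density \(f\) on \(\mathbb{R}^p\) (notation chosen here for illustration, not taken from the paper):
\[ H(X) = -\int_{\mathbb{R}^p} f(x)\,\log f(x)\,\mathrm{d}x, \qquad H_\alpha(X) = \frac{1}{1-\alpha}\,\log \int_{\mathbb{R}^p} f(x)^{\alpha}\,\mathrm{d}x \quad (\alpha > 0,\ \alpha \neq 1), \qquad J(X) = \int_{\mathbb{R}^p} f(x)\,\bigl\|\nabla \log f(x)\bigr\|^{2}\,\mathrm{d}x, \]
i.e. the Shannon (differential) entropy, the Rényi entropy of order \(\alpha\) (which recovers \(H(X)\) as \(\alpha \to 1\)), and Fisher's entropy-type information measure, respectively.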
Related Items (1)
Cites Work
- A generalized normal distribution
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- Cryptology: methods and maxims
- Multivariate \(\theta\)-generalized normal distributions
- On logarithmic Sobolev inequalities for higher order fractional derivatives
- Nonlinear diffusions, hypercontractivity and the optimal \(L^{p}\)-Euclidean logarithmic Sobolev inequality
- Brain electrical activity analysis using wavelet-based informational tools. II: Tsallis non-extensivity and complexity measures
- New parametric measures of information
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Logarithmic Sobolev Inequalities
- The Kotz-type distribution with applications
- Logarithmic Sobolev Inequalities for Information Measures
- The convolution inequality for entropy powers
- Elements of Information Theory