A coding theorem and Rényi's entropy
Publication: 5512654
DOI: 10.1016/S0019-9958(65)90332-3
zbMath: 0138.15103
OpenAlex: W2001935257
MaRDI QID: Q5512654
Publication date: 1965
Published in: Information and Control
Full work available at URL: https://doi.org/10.1016/s0019-9958(65)90332-3
Related Items (39)
Definition of entropy by means of a coding problem
On \(q\)-non-extensive statistics with non-Tsallisian entropy
The relation between information theory and the differential geometry approach to statistics
Role of information theoretic uncertainty relations in quantum theory
(R, S)-Norm Information Measure and A Relation Between Coding and Questionnaire Theory
A NOTE ON RÉNYI'S ENTROPY RATE FOR TIME-INHOMOGENEOUS MARKOV CHAINS
ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
Differential-escort transformations and the monotonicity of the LMC-Rényi complexity measure
Unnamed Item
Some properties of Rényi entropy over countably infinite alphabets
Some coding theorems for nonadditive generalized mean-value entropies
Aspects concerning entropy and utility
Tsallis entropy measure of noise-aided information transmission in a binary channel
Application of Hölder's inequality in information theory
Trees with exponentially growing costs
RENYI ENTROPY OF MAPS: APPLICATIONS TO FUZZY SETS, PATTERN RECOGNITION, AND CHAOTIC DYNAMICS
Source coding with escort distributions and Rényi entropy bounds
Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
New property of a generalized Hölder's inequality and its applications
Some inequalities in information theory using Tsallis entropy
Comparative study of generalized quantitative-qualitative inaccuracy fuzzy measures for noiseless coding theorem and 1:1 codes
A joint representation of Rényi's and Tsalli's entropy with application in coding theory
Error bounds for high-resolution quantization with Rényi-\(\alpha\)-entropy constraints
Some source coding theorems and 1:1 coding based on generalized inaccuracy measure of order \(\alpha\) and type \(\beta\)
On Noiseless Coding
Cramér-Rao lower bounds arising from generalized Csiszár divergences
Determination of all additive quasiarithmetic mean codeword lengths
Development of two new mean codeword lengths
Measuring statistical dependences in a time series
Alphabetic coding with exponential costs
Mean Entropies
Generalized entropies in coding theory
A class of measures of informativity of observation channels
Large deviations for conditional guesswork
Measuring information beyond communication theory - why some generalized information measures may be useful, others not
Relations between the observational entropy and Rényi information measures
A formulation of Rényi entropy on \(C^\ast\)-algebras
Trigonometric entropies, Jensen difference divergence measures, and error bounds
Rydberg multidimensional states: Rényi and Shannon entropies in momentum space