A coding theorem and Rényi's entropy

DOI: 10.1016/S0019-9958(65)90332-3
zbMath: 0138.15103
OpenAlex: W2001935257
MaRDI QID: Q5512654

L. Lorne Campbell

Publication date: 1965

Published in: Information and Control

Full work available at URL: https://doi.org/10.1016/s0019-9958(65)90332-3
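For context, the result usually cited from this paper can be summarized as follows. This is a sketch in standard textbook notation, not text taken from the record: here D denotes the code alphabet size, p_i the source probabilities, \ell_i the codeword lengths of a uniquely decodable code, L(t) the exponential (Campbell) average length, and H_\alpha the Rényi entropy of order \alpha.

\[
  L(t) \;=\; \frac{1}{t}\,\log_D\!\Bigl(\sum_i p_i\, D^{t\ell_i}\Bigr), \qquad t > 0,
\]
\[
  H_\alpha(P) \;=\; \frac{1}{1-\alpha}\,\log_D\!\Bigl(\sum_i p_i^{\alpha}\Bigr), \qquad \alpha = \frac{1}{1+t},
\]
\[
  H_\alpha(P) \;\le\; \min_{\{\ell_i\}} L(t) \;<\; H_\alpha(P) + 1.
\]

The bounds parallel Shannon's noiseless coding theorem, which is recovered in the limit t \to 0 (\alpha \to 1).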




Related Items (39)

Definition of entropy by means of a coding problem
On \(q\)-non-extensive statistics with non-Tsallisian entropy
The relation between information theory and the differential geometry approach to statistics
Role of information theoretic uncertainty relations in quantum theory
(R, S)-Norm Information Measure and A Relation Between Coding and Questionnaire Theory
A NOTE ON RÉNYI'S ENTROPY RATE FOR TIME-INHOMOGENEOUS MARKOV CHAINS
ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
Differential-escort transformations and the monotonicity of the LMC-Rényi complexity measure
Unnamed Item
Some properties of Rényi entropy over countably infinite alphabets
Some coding theorems for nonadditive generalized mean-value entropies
Aspects concerning entropy and utility
Tsallis entropy measure of noise-aided information transmission in a binary channel
Application of Hölder's inequality in information theory
Trees with exponentially growing costs
RENYI ENTROPY OF MAPS: APPLICATIONS TO FUZZY SETS, PATTERN RECOGNITION, AND CHAOTIC DYNAMICS
Source coding with escort distributions and Rényi entropy bounds
Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
New property of a generalized Hölder's inequality and its applications
Some inequalities in information theory using Tsallis entropy
Comparative study of generalized quantitative-qualitative inaccuracy fuzzy measures for noiseless coding theorem and 1:1 codes
A joint representation of Rényi's and Tsalli's entropy with application in coding theory
Error bounds for high-resolution quantization with Rényi-\(\alpha\)-entropy constraints
Some source coding theorems and 1:1 coding based on generalized inaccuracy measure of order \(\alpha\) and type \(\beta\)
On Noiseless Coding
Cramér-Rao lower bounds arising from generalized Csiszár divergences
Determination of all additive quasiarithmetic mean codeword lengths
Development of two new mean codeword lengths
Measuring statistical dependences in a time series
Alphabetic coding with exponential costs
Mean Entropies
Generalized entropies in coding theory
A class of measures of informativity of observation channels
Large deviations for conditional guesswork
Measuring information beyond communication theory - why some generalized information measures may be useful, others not
Relations between the observational entropy and Rényi information measures
A formulation of Rényi entropy on \(C^\ast\)-algebras
Trigonometric entropies, Jensen difference divergence measures, and error bounds
Rydberg multidimensional states: Rényi and Shannon entropies in momentum space



