Pages that link to "Item:Q5512654"
From MaRDI portal
The following pages link to A coding theorem and Rényi's entropy (Q5512654):
Displaying 39 items.
- Role of information theoretic uncertainty relations in quantum theory (Q307170) (← links)
- Some properties of Rényi entropy over countably infinite alphabets (Q383169) (← links)
- Some coding theorems for nonadditive generalized mean-value entropies (Q417256) (← links)
- Aspects concerning entropy and utility (Q430153) (← links)
- Tsallis entropy measure of noise-aided information transmission in a binary channel (Q434099) (← links)
- New property of a generalized Hölder's inequality and its applications (Q508730) (← links)
- Error bounds for high-resolution quantization with Rényi-\(\alpha\)-entropy constraints (Q532064) (← links)
- Source coding with escort distributions and Rényi entropy bounds (Q665308) (← links)
- Development of two new mean codeword lengths (Q713428) (← links)
- Measuring information beyond communication theory -- why some generalized information measures may be useful, others not (Q799647) (← links)
- Trees with exponentially growing costs (Q924721) (← links)
- Alphabetic coding with exponential costs (Q990132) (← links)
- Trigonometric entropies, Jensen difference divergence measures, and error bounds (Q1069901) (← links)
- The relation between information theory and the differential geometry approach to statistics (Q1086923) (← links)
- Application of Hölder's inequality in information theory. (Q1425261) (← links)
- On \(q\)-non-extensive statistics with non-Tsallisian entropy (Q1619068) (← links)
- Some inequalities in information theory using Tsallis entropy (Q1751312) (← links)
- Comparative study of generalized quantitative-qualitative inaccuracy fuzzy measures for noiseless coding theorem and 1:1 codes (Q1751350) (← links)
- A joint representation of Rényi's and Tsalli's entropy with application in coding theory (Q1751509) (← links)
- Cramér-Rao lower bounds arising from generalized Csiszár divergences (Q2006195) (← links)
- Relations between the observational entropy and Rényi information measures (Q2101218) (← links)
- A formulation of Rényi entropy on \(C^\ast\)-algebras (Q2105966) (← links)
- Differential-escort transformations and the monotonicity of the LMC-Rényi complexity measure (Q2156641) (← links)
- Some source coding theorems and 1:1 coding based on generalized inaccuracy measure of order \(\alpha \) and type \(\beta \) (Q2255032) (← links)
- Large deviations for conditional guesswork (Q2322662) (← links)
- Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals (Q2465342) (← links)
- Measuring statistical dependences in a time series (Q2499948) (← links)
- Generalized entropies in coding theory (Q2543577) (← links)
- A class of measures of informativity of observation channels (Q2556388) (← links)
- On Noiseless Coding (Q3328439) (← links)
- Mean Entropies (Q3368413) (← links)
- ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH (Q3728800) (← links)
- RENYI ENTROPY OF MAPS: APPLICATIONS TO FUZZY SETS, PATTERN RECOGNITION, AND CHAOTIC DYNAMICS (Q3801467) (← links)
- (Q3859701) (← links)
- Determination of all additive quasiarithmetic mean codeword lengths (Q4404123) (← links)
- A NOTE ON RÉNYI'S ENTROPY RATE FOR TIME-INHOMOGENEOUS MARKOV CHAINS (Q5056632) (← links)
- (<i>R, S</i>)-Norm Information Measure and A Relation Between Coding and Questionnaire Theory (Q5298329) (← links)
- Definition of entropy by means of a coding problem (Q5553814) (← links)
- Rydberg multidimensional states: Rényi and Shannon entropies in momentum space (Q5876366) (← links)