Application of Hölder's inequality in information theory. (Q1425261)

scientific article

    Statements

    Application of Hölder's inequality in information theory. (English)
    15 March 2004
    The authors introduce two `useful' measures of information, namely \[ H_{\alpha}(P^{\beta}; U) = \frac{1}{1-\alpha} \log \sum_{i=1}^n \left( \frac{u_i p_i^{\beta\alpha}}{\sum_{i=1}^n u_i p_i^{\beta}} \right) \] and \[ H_R(P; U) = \frac{R}{R-1} \left[ 1 - \left\{ \sum_{i=1}^n \left( \frac{u_i p_i^R}{\sum_{i=1}^n u_i p_i} \right) \right\}^{1/R} \right], \] and prove the corresponding coding theorems for uniquely decipherable codes. There are many misprints in this paper.
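    A minimal numerical sketch of the two `useful' measures as written above, assuming the parameter ranges \(\alpha \neq 1\), \(R > 0\), \(R \neq 1\) and positive probabilities and utilities; the function names, the example distributions, and the choice of logarithm base are illustrative, not taken from the paper.

```python
import math


def useful_alpha_measure(p, u, alpha, beta=1.0, base=2.0):
    """H_alpha(P^beta; U) as given in the review:
    1/(1-alpha) * log( sum_i u_i p_i^(alpha*beta) / sum_i u_i p_i^beta ).
    Assumes alpha != 1, p_i > 0, u_i > 0; the log base is a convention choice.
    """
    if alpha == 1.0:
        raise ValueError("alpha must differ from 1")
    num = sum(ui * pi ** (alpha * beta) for pi, ui in zip(p, u))
    den = sum(ui * pi ** beta for pi, ui in zip(p, u))
    return math.log(num / den, base) / (1.0 - alpha)


def useful_r_norm_measure(p, u, r):
    """H_R(P; U) as given in the review:
    R/(R-1) * (1 - (sum_i u_i p_i^R / sum_i u_i p_i)^(1/R)).
    Assumes R > 0, R != 1, p_i > 0, u_i > 0.
    """
    if r <= 0.0 or r == 1.0:
        raise ValueError("R must be positive and different from 1")
    num = sum(ui * pi ** r for pi, ui in zip(p, u))
    den = sum(ui * pi for pi, ui in zip(p, u))
    return r / (r - 1.0) * (1.0 - (num / den) ** (1.0 / r))


if __name__ == "__main__":
    p = [0.5, 0.3, 0.2]   # probability distribution P (illustrative)
    u = [2.0, 1.0, 3.0]   # utility distribution U (illustrative)
    print(useful_alpha_measure(p, u, alpha=2.0, beta=1.0))  # ~1.42 (bits)
    print(useful_r_norm_measure(p, u, r=2.0))               # ~0.78
```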
    coding theorem
    exponential mean length of code words
    Hölder inequality
    information measure
    mean code words length
    uniquely decipherable code

    Identifiers