Development of two new mean codeword lengths (Q713428)

scientific article

    Statements

    Development of two new mean codeword lengths (English)
    29 October 2012
    In this paper, the authors introduce two new mean codeword lengths \[ L(\alpha, \beta) = \frac{\alpha}{1-\alpha} \log_D \left[ \frac{\sum_{i=1}^n p_i^{\beta/\alpha} \, D^{l_i (1-\alpha)/\alpha}}{\sum_{i=1}^n p_i^{\beta/\alpha}} \right], \qquad \alpha > 0, \; \beta > 0, \; \alpha \neq 1, \] and \[ L(\beta) = \frac{\sum_{i=1}^n p_i^{\beta} \, l_i}{\sum_{i=1}^n p_i^{\beta}}, \qquad \beta > 0, \] where \(l_i\) is the length of the codeword \(x_i\) and \(p_i\) is the probability of occurrence of the codeword \(x_i\). They prove the following results concerning these new codeword lengths. For all uniquely decipherable codes, the exponentiated mean codeword length \(L(\alpha, \beta)\) satisfies \(E^{\alpha}_{\beta}(P) < L(\alpha, \beta) < E^{\alpha}_{\beta}(P) + 1\), where \[ E^{\alpha}_{\beta}(P) = \frac{1}{\alpha-1} \log_D \left[ \frac{\big( \sum_{i=1}^n p_i^{\beta/\alpha} \big)^{\alpha}}{\sum_{i=1}^n p_i^{\beta}} \right] \] is the Kapur entropy, whereas the mean codeword length \(L(\beta)\) lies between \(K^{\beta}(P)\) and \(K^{\beta}(P) + 1\), where \[ K^{\beta}(P) = \frac{\sum_{i=1}^n p_i^{\beta} \, \log_D \left( \frac{p_i^{\beta}}{\sum_{i=1}^n p_i^{\beta}} \right)}{\sum_{i=1}^n p_i^{\beta}}. \]
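    The bound on \(L(\alpha, \beta)\) can be checked numerically. The sketch below (a minimal illustration, not from the paper; the distribution `p` and binary codeword lengths `l` are assumed example values chosen to satisfy Kraft's inequality for \(D = 2\)) implements \(L(\alpha, \beta)\) and the Kapur entropy \(E^{\alpha}_{\beta}(P)\) directly from the formulas above.

```python
import math

def kapur_entropy(p, alpha, beta, D=2):
    # E^alpha_beta(P) = 1/(alpha-1) * log_D[ (sum_i p_i^(beta/alpha))^alpha / sum_i p_i^beta ]
    num = sum(pi ** (beta / alpha) for pi in p) ** alpha
    den = sum(pi ** beta for pi in p)
    return math.log(num / den, D) / (alpha - 1)

def mean_length(p, l, alpha, beta, D=2):
    # L(alpha, beta) = alpha/(1-alpha) *
    #   log_D[ sum_i p_i^(beta/alpha) D^(l_i (1-alpha)/alpha) / sum_i p_i^(beta/alpha) ]
    num = sum(pi ** (beta / alpha) * D ** (li * (1 - alpha) / alpha)
              for pi, li in zip(p, l))
    den = sum(pi ** (beta / alpha) for pi in p)
    return alpha / (1 - alpha) * math.log(num / den, D)

# Illustrative 3-symbol source (assumed values) with codeword lengths 1, 2, 2,
# which satisfy Kraft's inequality for D = 2: 1/2 + 1/4 + 1/4 <= 1.
p = [0.4, 0.35, 0.25]
l = [1, 2, 2]
alpha, beta = 2.0, 1.0

E = kapur_entropy(p, alpha, beta)
L = mean_length(p, l, alpha, beta)
print(E, L)  # the first theorem gives E < L < E + 1
```

    For a dyadic distribution with optimal lengths \(l_i = -\log_2 p_i\), the two quantities coincide, so the lower bound is tight in that limiting case.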
    entropy
    mean codeword length
    uniquely decipherable code
    best 1-to-1 code
