Development of two new mean codeword lengths (Q713428)
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Development of two new mean codeword lengths | scientific article | |
Statements
Development of two new mean codeword lengths (English)
29 October 2012
In this paper, the authors introduce two new mean codeword lengths
\[ L(\alpha, \beta) = \frac{\alpha}{1-\alpha} \log_D \Bigg[ \frac{\sum_{i=1}^n p_i^{\beta/\alpha} \, D^{l_i (1-\alpha)/\alpha}}{\sum_{i=1}^n p_i^{\beta/\alpha}} \Bigg], \qquad \alpha > 0, \,\, \beta > 0, \,\, \alpha \neq 1, \]
and
\[ L(\beta) = \frac{\sum_{i=1}^n p_i^{\beta} \, l_i}{\sum_{i=1}^n p_i^{\beta}}, \qquad \beta > 0, \]
where \(l_i\) is the length of the codeword \(x_i\) and \(p_i\) is the probability of occurrence of \(x_i\). They prove the following results concerning these new codeword lengths. For all uniquely decipherable codes, the exponentiated mean codeword length \(L(\alpha, \beta)\) satisfies \(E^{\alpha}_{\beta}(P) < L(\alpha, \beta) < E^{\alpha}_{\beta}(P) + 1\), where
\[ E^{\alpha}_{\beta}(P) = \frac{1}{\alpha-1} \log_D \Bigg[ \frac{\big( \sum_{i=1}^n p_i^{\beta/\alpha} \big)^{\alpha}}{\sum_{i=1}^n p_i^{\beta}} \Bigg] \]
is the Kapur entropy, whereas the mean codeword length \(L(\beta)\) lies between \(K^{\beta}(P)\) and \(K^{\beta}(P) + 1\), where
\[ K^{\beta}(P) = -\frac{\sum_{i=1}^n p_i^{\beta} \, \log_D \Big( \frac{p_i^{\beta}}{\sum_{i=1}^n p_i^{\beta}} \Big)}{\sum_{i=1}^n p_i^{\beta}} \]
is the Shannon entropy of the escort distribution \(p_i^{\beta} / \sum_{j=1}^n p_j^{\beta}\).
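To make the two bounds concrete, here is a minimal numeric sketch (not taken from the paper) that evaluates \(L(\alpha, \beta)\), \(E^{\alpha}_{\beta}(P)\), \(L(\beta)\) and \(K^{\beta}(P)\) for a sample source. The probability vector, the code alphabet size \(D = 2\), the parameter values, and the Shannon-type lengths \(l_i = \lceil -\log_D p_i \rceil\) (which satisfy the Kraft inequality and hence are realizable by a uniquely decipherable code) are illustrative assumptions, not the authors' construction.

```python
import math

def L_alpha_beta(p, l, alpha, beta, D=2):
    """Exponentiated mean codeword length L(alpha, beta)."""
    num = sum(pi ** (beta / alpha) * D ** (li * (1 - alpha) / alpha)
              for pi, li in zip(p, l))
    den = sum(pi ** (beta / alpha) for pi in p)
    return alpha / (1 - alpha) * math.log(num / den, D)

def E_alpha_beta(p, alpha, beta, D=2):
    """Kapur entropy E^alpha_beta(P), the stated lower bound of L(alpha, beta)."""
    num = sum(pi ** (beta / alpha) for pi in p) ** alpha
    den = sum(pi ** beta for pi in p)
    return math.log(num / den, D) / (alpha - 1)

def L_beta(p, l, beta):
    """Mean codeword length L(beta): average of l_i weighted by p_i^beta."""
    den = sum(pi ** beta for pi in p)
    return sum(pi ** beta * li for pi, li in zip(p, l)) / den

def K_beta(p, beta, D=2):
    """Entropy K^beta(P) of the escort distribution p_i^beta / sum_j p_j^beta."""
    den = sum(pi ** beta for pi in p)
    return -sum(pi ** beta * math.log(pi ** beta / den, D) for pi in p) / den

# Illustrative source and code: sample probabilities, binary alphabet, and
# Shannon-type lengths l_i = ceil(-log_D p_i), which satisfy the Kraft inequality.
p = [0.4, 0.3, 0.2, 0.1]
D = 2
l = [math.ceil(-math.log(pi, D)) for pi in p]

alpha, beta = 0.5, 1.0
print("codeword lengths:", l)
print(f"E^a_b(P) = {E_alpha_beta(p, alpha, beta, D):.4f}   "
      f"L(a,b) = {L_alpha_beta(p, l, alpha, beta, D):.4f}")
print(f"K^b(P)   = {K_beta(p, beta, D):.4f}   "
      f"L(b)   = {L_beta(p, l, beta):.4f}")
```

Running the sketch prints the two entropies next to the corresponding mean lengths, so they can be compared directly against the bounds stated above; the chosen lengths only illustrate the quantities and are not claimed to be optimal.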
entropy
mean codeword length
uniquely decipherable code
best 1-to-1 code