Mutual information, metric entropy and cumulative relative entropy risk (Q1383090)

From MaRDI portal
Language: English
Label: Mutual information, metric entropy and cumulative relative entropy risk
Description: scientific article

    Statements

    Mutual information, metric entropy and cumulative relative entropy risk (English)
    21 September 1999
    Given a sample of size \(n\) distributed according to \(P_{\theta^*}\), let \(\widehat P_t\) be an estimate of \(P_{\theta^*}\) based on the observations up to time \(t\leq n\). The authors measure the total risk up to time \(n\) by the Kullback-Leibler divergence \(R_{n,\widehat P}(\theta^*)=D_{KL} (P^n_{\theta^*} \|\widehat P)\), where \(\widehat P\) is the product measure of the \(\widehat P_t\). Based on this risk, the authors introduce both the minimax and the Bayes strategy. It turns out that the Bayes risk is the mutual information \(I (\Theta^*;Y^n)\) between the random parameter \(\Theta^*\) and the sample \(Y^n\). Let \(M_{n,\mu}\) be the marginal distribution of \(Y^n\) when \(\Theta^*\) has distribution \(\mu\). The main results in the first part of the paper are two-sided bounds on \(I(\Theta^*;Y^n)\) and \(D_{KL}(P^n_{\theta^*}\| M_{n,\mu})\) in terms of Rényi distances \(I_\alpha (P_{\theta^*}, P_{\tilde\theta})\) of order \(\alpha\) and \(I_1 (P_{\theta^*},Q_{\tilde\theta})\), where \(Q_\theta\) denotes the distribution of \(Y\) given \(\theta\). For a finite parameter set, \(I(\Theta^*;Y^n)\) is shown to tend at an exponential rate to the entropy \(H(\Theta^*)\). In the second part of the paper, the asymptotics of the minimax risk based on \(R_{n,\widehat P}(\theta^*)\) are studied for general \(\Theta\). The authors use the concepts of metric entropy and metric dimension to describe the topological structure of \(\Theta\) and relate these properties to the minimax risk for large \(n\).
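    The identity behind the Bayes strategy can be made explicit. The following display is a standard chain-rule computation, paraphrased rather than quoted from the paper: since \(\widehat P\) is the product of the predictive estimates \(\widehat P_t\) and the data are i.i.d. under \(P_{\theta^*}\),
\[ R_{n,\widehat P}(\theta^*) = D_{KL}\bigl(P^n_{\theta^*}\,\|\,\widehat P\bigr) = \sum_{t=1}^n \mathbb{E}_{\theta^*}\, D_{KL}\bigl(P_{\theta^*}\,\|\,\widehat P_t\bigr), \]
so the total risk is the cumulative expected instantaneous risk. For the Bayes strategy, \(\widehat P_t = M_\mu(\cdot\mid Y^{t-1})\), the product of the predictive distributions is exactly the marginal \(M_{n,\mu}\), and averaging over the prior gives
\[ \mathbb{E}_{\mu}\, R_{n,\widehat P}(\Theta^*) = \mathbb{E}_{\mu}\, D_{KL}\bigl(P^n_{\Theta^*}\,\|\,M_{n,\mu}\bigr) = I(\Theta^*;Y^n). \]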
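    For a finite parameter set these quantities can be computed exactly. Below is a minimal numerical sketch, not code from the paper, using assumed ingredients: a two-point Bernoulli family with a uniform prior. It shows the Bayes risk \(\mathbb{E}_\mu D_{KL}(P^n_{\Theta^*}\|M_{n,\mu})\) coinciding with \(I(\Theta^*;Y^n)\) and approaching the prior entropy \(H(\Theta^*)=\log 2\) at an exponential rate, in line with the finite-parameter result above.

import numpy as np
from scipy.stats import binom

# Illustrative choices (not from the paper): a two-point Bernoulli
# family and a uniform prior on it.
thetas = np.array([0.3, 0.7])   # finite parameter set Theta
mu = np.array([0.5, 0.5])       # prior mu on Theta

def kl_to_marginal(n):
    """Return D_KL(P^n_theta || M_{n,mu}) for each theta in thetas.

    For i.i.d. Bernoulli observations the likelihood ratio depends on
    the length-n sample only through the success count k, so the KL
    divergence over sequences reduces to a sum over k = 0, ..., n.
    """
    k = np.arange(n + 1)
    pmf = np.vstack([binom.pmf(k, n, t) for t in thetas])  # P_theta(k)
    marginal = mu @ pmf                                    # M_{n,mu}(k)
    ratio = np.where(pmf > 0, pmf / marginal, 1.0)
    return np.sum(pmf * np.log(ratio), axis=1)

prior_entropy = -np.sum(mu * np.log(mu))   # H(Theta*) = log 2 (nats)
for n in (1, 5, 10, 20, 40):
    bayes_risk = mu @ kl_to_marginal(n)    # = I(Theta*; Y^n)
    print(f"n={n:3d}  I={bayes_risk:.6f}  "
          f"H(Theta*) - I = {prior_entropy - bayes_risk:.2e}")

    Running the sketch, the gap \(H(\Theta^*)-I(\Theta^*;Y^n)\) shrinks by roughly a constant factor per added observation, illustrating the exponential convergence for finite \(\Theta\).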
    Hellinger distance
    Kullback-Leibler divergence
    Bayes risk
    Rényi distances
    metric dimension