Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (Q742670)

scientific article
English

    Statements

    Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (English)
    Published: 19 September 2014
    Summary: The Minimum Mutual Information (MinMI) Principle provides the least-committed, maximum-joint-entropy (ME) inferential law compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets \(\mathbf T_{cr}\) comprising \(m_{cr}\) linear and/or nonlinear joint expectations, computed from samples of \(N\) iid outcomes. Marginals (and their entropies) are imposed by single morphisms of the original random variables. \(N\)-asymptotic formulas are given for the distribution of the cross-expectation estimation errors and for the bias, variance and distribution of the MinMI estimate. A growing \(\mathbf T_{cr}\) leads to an increasing MinMI, eventually converging to the total MI. Under \(N\)-sized samples, the MinMI increment between two nested sets \(\mathbf T_{cr1}\subset\mathbf T_{cr2}\) (with numbers of constraints \(m_{cr1}<m_{cr2}\)) is the test difference \(\delta H=H_{\max 1,N}-H_{\max 2,N}\geq 0\) between the two respective estimated MEs. Asymptotically, \(\delta H\) follows a chi-squared distribution \(\frac{1}{2N}\chi^2_{(m_{cr2}-m_{cr1})}\) whose upper quantiles determine whether the constraints in \(\mathbf T_{cr2}\setminus\mathbf T_{cr1}\) explain significant extra MI. As an example, we set the marginals to be normally distributed (Gaussian) and build a sequence of MI bounds associated with successive nonlinear correlations due to joint non-Gaussianity. Since available sample sizes can be rather small in real-world situations, the relationship between MinMI bias, probability-density over-fitting and outliers is demonstrated for under-sampled data.
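    The summary describes a chi-squared significance test for the MinMI increment between two nested constraint sets. The following minimal Python sketch illustrates that test, assuming the two maximum entropies have already been estimated from the same \(N\)-sized sample; the function and variable names (minmi_increment_test, h_max_1, h_max_2) and the numeric values in the usage example are illustrative assumptions, not taken from the paper.

    # Illustrative sketch of the asymptotic test quoted in the summary:
    # delta H = H_max1,N - H_max2,N  ~  (1/(2N)) * chi^2 with (m_cr2 - m_cr1) dof.
    from scipy.stats import chi2

    def minmi_increment_test(h_max_1, h_max_2, m_cr1, m_cr2, n_samples, alpha=0.05):
        """Test whether the extra constraints in T_cr2 \ T_cr1 explain
        statistically significant additional mutual information.

        h_max_1, h_max_2 : estimated maximum entropies under T_cr1 and T_cr2 (nats)
        m_cr1, m_cr2     : numbers of constraints, with m_cr1 < m_cr2
        n_samples        : sample size N of iid outcomes
        """
        delta_h = h_max_1 - h_max_2             # MinMI increment, >= 0 in theory
        dof = m_cr2 - m_cr1                     # degrees of freedom of the test
        statistic = 2.0 * n_samples * delta_h   # asymptotically chi^2(dof) under H0
        p_value = chi2.sf(statistic, dof)       # upper-tail probability
        return delta_h, p_value, p_value < alpha

    # Hypothetical usage: N = 500 samples, 2 vs. 5 constraints, entropies in nats.
    dh, p, significant = minmi_increment_test(1.352, 1.340, 2, 5, 500)
    print(f"delta H = {dh:.4f} nats, p = {p:.4f}, extra MI significant: {significant}")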
    mutual information
    non-Gaussianity
    maximum entropy distributions
    entropy bias
    mutual information distribution
    morphism