Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (Q742670)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.3390/e15030721
Property / OpenAlex ID: W2024993643
Property / Wikidata QID: Q64391353
Property / cites work: A Mathematical Theory of Communication
Property / cites work: Entropy and information approaches to genetic diversity and its expression: genomic geography
Property / cites work: Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
Property / cites work: Estimation of Entropy and Mutual Information
Property / cites work: Q4395705
Property / cites work: Entropy estimates of small data sets
Property / cites work: An introduction to copulas. Properties and applications
Property / cites work: Mutual information is copula entropy
Property / cites work: Distribution of mutual information from complete and incomplete data
Property / cites work: Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
Property / cites work: Entropy densities with an application to autoregressive conditional skewness and kurtosis.

Language: English
Label: Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples
Description: scientific article

    Statements

    Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (English)
    19 September 2014
    Summary: The Minimum Mutual Information (MinMI) Principle provides the least committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets \(\mathbf T_{cr}\) comprising \(m_{cr}\) linear and/or nonlinear joint expectations, computed from samples of \(N\) iid outcomes. Marginals (and their entropy) are imposed by single morphisms of the original random variables. \(N\)-asymptotic formulas are given for the distribution of the cross-expectation estimation errors and for the bias, variance and distribution of the MinMI estimate. A growing \(\mathbf T_{cr}\) leads to an increasing MinMI, eventually converging to the total MI. For samples of size \(N\), the MinMI increment between two nested sets \(\mathbf T_{cr1}\subset\mathbf T_{cr2}\) (with numbers of constraints \(m_{cr1}<m_{cr2}\)) is the test difference \(\delta H=H_{\max 1,N}-H_{\max 2,N}\geq 0\) between the two respective estimated MEs. Asymptotically, \(\delta H\) follows a chi-squared distribution \(\frac{1}{2N}\chi^2_{(m_{cr2}-m_{cr1})}\) whose upper quantiles determine whether the constraints in \(\mathbf T_{cr2}\setminus\mathbf T_{cr1}\) explain significant extra MI. As an example, we set the marginals to be normally distributed (Gaussian) and build a sequence of MI bounds associated with successive nonlinear correlations due to joint non-Gaussianity. Since available sample sizes can be rather low in real-world situations, the relationship between the MinMI bias, probability-density over-fitting and outliers is demonstrated for under-sampled data.
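    The asymptotic significance test described in the summary can be sketched in code. The following Python fragment is a minimal illustration only (the function name minmi_increment_test, the variable names and all numeric values are assumptions, not material from the paper): it computes the MinMI increment \(\delta H\) between two nested constraint sets and evaluates the upper tail of \(2N\,\delta H\) under the chi-squared law with \(m_{cr2}-m_{cr1}\) degrees of freedom.

# Minimal sketch, assuming estimated maximum entropies under nested constraint
# sets T_cr1 (subset of) T_cr2 are already available. Per the summary,
# deltaH = H_max1,N - H_max2,N is asymptotically distributed as
# (1/(2N)) * chi-squared with (m_cr2 - m_cr1) degrees of freedom.
# All names and numbers here are illustrative, not taken from the paper.
from scipy.stats import chi2

def minmi_increment_test(h_max1, h_max2, m_cr1, m_cr2, n, alpha=0.05):
    """Return (deltaH, p-value, significant?) for the MinMI increment test."""
    delta_h = h_max1 - h_max2          # estimated MinMI increment (>= 0 in theory)
    dof = m_cr2 - m_cr1                # number of extra constraints in T_cr2 \ T_cr1
    statistic = 2.0 * n * delta_h      # asymptotically chi-squared with dof degrees of freedom
    p_value = chi2.sf(statistic, dof)  # upper-tail probability
    return delta_h, p_value, p_value < alpha

# Hypothetical example: N = 500 iid outcomes, 2 vs. 4 cross constraints.
delta_h, p, significant = minmi_increment_test(
    h_max1=1.412, h_max2=1.398, m_cr1=2, m_cr2=4, n=500)
print(f"deltaH = {delta_h:.4f}, p = {p:.4f}, significant extra MI: {significant}")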
    mutual information
    non-Gaussianity
    maximum entropy distributions
    entropy bias
    mutual information distribution
    morphism