Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties (Q406088)

From MaRDI portal
scientific article (English)
Publication date: 8 September 2014

Statements

Property / review text: Summary: The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), \(I(X,Y)\), between random variables \(X\) and \(Y\), compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, using standard bivariate Gaussian marginal distributions, this allows the MI to be decomposed into two positive terms: the Gaussian MI \((I_g)\), which depends on the Gaussian correlation, i.e. the correlation between 'Gaussianized variables', and a non-Gaussian MI \((I_{ng})\), which coincides with the joint negentropy and depends on nonlinear correlations. Joint moments of a prescribed total order \(p\) are bounded within a compact set defined by Schwarz-like inequalities, where \(I_{ng}\) grows from zero on the 'Gaussian manifold', where the moments are those of Gaussian distributions, towards infinity at the set's boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating \(I_{ng}\) between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, for a full range of signal-to-noise ratio \((snr)\) variances. We have studied the effect of varying \(snr\) on \(I_g\) and \(I_{ng}\) under several signal/noise scenarios.
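
As a numerical illustration of the decomposition above, the Python sketch below is a minimal stand-in, not the authors' code (the record below notes the project used Mathematica); the quadratic channel, the histogram MI estimator, and all names are illustrative assumptions. It uses the standard closed form for the bivariate Gaussian MI, \(I_g = -\frac{1}{2}\log(1-\rho_g^2)\), with \(\rho_g\) the correlation between the Gaussianized variables, estimates the total MI \(I\) with a crude plug-in estimator, and reads off the non-Gaussian part as \(I_{ng} \approx I - I_g\) while sweeping the snr:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def gaussianize(x):
    # Empirical-CDF transform to standard-Gaussian scores ('Gaussianized variables').
    ranks = np.argsort(np.argsort(x)) + 1.0
    return norm.ppf(ranks / (len(x) + 1.0))

def gaussian_mi(x, y):
    # Gaussian MI: I_g = -0.5 * log(1 - rho_g^2), rho_g from Gaussianized samples.
    rho = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log1p(-rho ** 2)

def histogram_mi(x, y, bins=40):
    # Crude plug-in MI estimate (nats) from a 2-D histogram; biased, demo only.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

# Hypothetical nonlinear channel (an assumption, not the paper's exact setup):
# Y = X + 0.5*X^2 + additive Gaussian noise, with noise variance set by snr.
n = 100_000
x = rng.standard_normal(n)
for snr in (0.5, 2.0, 8.0):
    y = x + 0.5 * x ** 2 + rng.standard_normal(n) / np.sqrt(snr)
    i_g = gaussian_mi(x, y)
    i_tot = histogram_mi(x, y)
    print(f"snr={snr:4.1f}  I_g={i_g:.3f}  I_total~{i_tot:.3f}  I_ng~{max(i_tot - i_g, 0.0):.3f}")

The snr sweep mirrors the kind of experiment described in the summary, not the paper's actual results; a kernel or k-nearest-neighbour MI estimator would be a more accurate substitute for the histogram plug-in.
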
Property / Mathematics Subject Classification ID: 94A17
Property / Mathematics Subject Classification ID: 62B10
Property / zbMATH DE Number: 6341008
Property / zbMATH Keywords: mutual information
Property / zbMATH Keywords: non-Gaussianity
Property / zbMATH Keywords: maximum entropy distributions
Property / zbMATH Keywords: non-Gaussian noise
Property / describes a project that uses: Mathematica
Property / MaRDI profile type: MaRDI publication profile
Property / Wikidata QID: Q64391355
Property / full work available at URL: https://doi.org/10.3390/e14061103
Property / OpenAlex ID: W2049420890
Property / cites work: Independent coordinates for strange attractors from mutual information
Property / cites work: A mathematical framework for quantifying predictability through relative entropy
Property / cites work: Entropy expressions for multivariate continuous distributions
Property / cites work: Expressions for Rényi and Shannon entropies for bivariate distributions
Property / cites work: A Convergent Iterative Procedure for Constructing Bivariate Distributions
Property / cites work: Multivariate maximum entropy identification, transformation, and dependence
Property / cites work: An improved algorithm for the multidimensional moment-constrained maximum entropy problem
Property / cites work: The multidimensional moment-constrained maximum entropy problem: A BFGS algorithm with constraint scaling
Property / cites work: The multidimensional maximum entropy moment problem: a review on numerical methods
Property / cites work: Entropy densities with an application to autoregressive conditional skewness and kurtosis
Property / cites work: Mutual Information and Minimum Mean-Square Error in Gaussian Channels
Property / cites work: Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
Property / cites work: Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
Property / cites work: Constructing multivariate distributions with specific marginal distributions
Property / cites work: Edgeworth Approximation of Multivariate Differential Entropy
Property / cites work: Independent component analysis, a new concept?
Property / cites work: Asymptotic Statistics
Property / cites work: Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error
Property / cites work: Some numerical experiments with variable-storage quasi-Newton algorithms
Property / DBLP publication ID: journals/entropy/PiresP12