Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties (Q406088)
From MaRDI portal
Property / Mathematics Subject Classification ID: 94A17
Property / Mathematics Subject Classification ID: 62B10
Property / zbMATH DE Number: 6341008
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties | scientific article | |
Statements
Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties (English)
0 references
8 September 2014
0 references
Summary: The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), \(I(X,Y)\), between random variables \(X,Y\), which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, using standard bivariate Gaussian marginal distributions, this allows the MI to be decomposed into two positive terms: the Gaussian MI \((I_g)\), depending upon the Gaussian correlation, i.e., the correlation between 'Gaussianized variables', and a non-Gaussian MI \((I_{ng})\), coinciding with the joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order \(p\) are bounded within a compact set defined by Schwarz-like inequalities; there, \(I_{ng}\) grows from zero on the 'Gaussian manifold', where the moments are those of Gaussian distributions, towards infinity at the set's boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating \(I_{ng}\) between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, over a full range of signal-to-noise variance ratios \((snr)\). We have studied the effect of varying \(snr\) on \(I_g\) and \(I_{ng}\) under several signal/noise scenarios. (A minimal numerical sketch of the \(I_g\)/\(I_{ng}\) split is given after the statements below.)
0 references
mutual information
0 references
non-Gaussianity
0 references
maximum entropy distributions
0 references
non-Gaussian noise
0 references
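The decomposition described in the summary lends itself to a small numerical illustration. The following Python sketch is not the authors' code: it assumes the standard bivariate-Gaussian identity \(I_g = -\tfrac{1}{2}\ln(1-\rho_g^2)\), with \(\rho_g\) the correlation between rank-Gaussianized variables, and treats the remaining dependence \(I_{ng} = I(X,Y) - I_g\) as the non-Gaussian (joint negentropy) term; the two synthetic channels are hypothetical examples, not the paper's experimental setup.

```python
# Minimal sketch (not the authors' implementation): Gaussian part of the MI
# obtained from the correlation between rank-Gaussianized variables.
import numpy as np
from scipy.stats import norm, rankdata

def gaussianize(x):
    """Map a sample to approximately standard Gaussian marginals via the
    empirical probability integral (rank) transform."""
    u = rankdata(x) / (len(x) + 1.0)   # empirical CDF values in (0, 1)
    return norm.ppf(u)                 # the 'Gaussianized' variable

def gaussian_mi(x, y):
    """Gaussian MI term I_g = -0.5 * ln(1 - rho_g^2) in nats, where rho_g is
    the correlation between the Gaussianized variables."""
    rho_g = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log1p(-rho_g**2)

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)                        # channel input

# Linear channel: the dependence is captured almost entirely by I_g.
y_lin = x + 0.5 * rng.normal(size=n)
# Purely quadratic channel: rho_g is close to zero, so I_g is close to zero
# even though X and Y are strongly dependent; that dependence would appear
# in the non-Gaussian term I_ng = I(X,Y) - I_g.
y_quad = x**2 + 0.5 * rng.normal(size=n)

print(f"linear channel   : I_g ~ {gaussian_mi(x, y_lin):.3f} nats")
print(f"quadratic channel: I_g ~ {gaussian_mi(x, y_quad):.3f} nats")
```

Estimating \(I_{ng}\) itself would additionally require a full estimator of \(I(X,Y)\) (for instance a nearest-neighbour MI estimator); with Gaussianized marginals, whatever exceeds \(I_g\) is, as the summary notes, the joint negentropy contributed by nonlinear correlations.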