Mutual Information and Minimum Mean-Square Error in Gaussian Channels

From MaRDI portal

Publication: 3546951

DOI: 10.1109/TIT.2005.844072
zbMath: 1309.94099
MaRDI QID: Q3546951

Dongning Guo, Sergio Verdú, Shlomo Shamai

Publication date: 21 December 2008

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://doi.org/10.1109/tit.2005.844072


94A12: Signal theory (characterization, reconstruction, filtering, etc.)

94A40: Channel models (including quantum) in information and communication theory
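
The paper's central result is the I-MMSE identity: for the Gaussian channel Y = √snr · X + N with N standard Gaussian, the derivative of the mutual information with respect to the signal-to-noise ratio equals half the minimum mean-square error, dI/dsnr = ½ · mmse(snr). A minimal numeric sanity check for a standard Gaussian input, where both sides have closed forms (the function names below are illustrative, not from the paper):

```python
import math

def mutual_info(snr):
    # Gaussian input on an AWGN channel: I(snr) = (1/2) log(1 + snr) nats
    return 0.5 * math.log1p(snr)

def mmse(snr):
    # MMSE of estimating a standard Gaussian input from Y: 1 / (1 + snr)
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6
# Central-difference approximation of dI/dsnr
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
# I-MMSE identity: dI/dsnr should equal (1/2) * mmse(snr)
assert abs(deriv - 0.5 * mmse(snr)) < 1e-9
```

Here dI/dsnr = ½/(1+snr) = ½·mmse(snr) holds exactly for Gaussian input; the paper's contribution is that the identity holds for arbitrary input distributions.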


Related Items

- Asymptotic mutual information for the balanced binary stochastic block model
- Community Detection and Stochastic Block Models
- ENTROPY FLOW AND DE BRUIJN'S IDENTITY FOR A CLASS OF STOCHASTIC DIFFERENTIAL EQUATIONS DRIVEN BY FRACTIONAL BROWNIAN MOTION
- The adaptive interpolation method for proving replica formulas. Applications to the Curie–Weiss and Wigner spike models
- Information theoretic limits of learning a sparse rule
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
- Information-Theoretic Bounds and Approximations in Neural Population Coding
- A DE BRUIJN'S IDENTITY FOR DEPENDENT RANDOM VARIABLES BASED ON COPULA THEORY
- Relations Between Information and Estimation in the Presence of Feedback
- A stochastic successive minimization method for nonsmooth nonconvex optimization with applications to transceiver design in wireless communication networks
- Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
- An integral representation of the relative entropy
- Is mutual information adequate for feature selection in regression?
- Heat equation and convolution inequalities
- Quantifying information transmission in eukaryotic gradient sensing and chemotactic response
- Continuous trajectory planning of mobile sensors for informative forecasting
- Second-order converses via reverse hypercontractivity
- Informative windowed forecasting of continuous-time linear systems for mutual information-based sensor planning
- A concavity property for the reciprocal of Fisher information and its consequences on Costa's EPI
- A Stein deficit for the logarithmic Sobolev inequality
- Fundamental limits of symmetric low-rank matrix estimation
- Strong replica symmetry in high-dimensional optimal Bayesian inference
- Statistical limits of spiked tensor models
- Multi-target robust waveform design based on harmonic variance and mutual information
- The information-theoretic meaning of Gagliardo-Nirenberg type inequalities
- On signalling and estimation limits for molecular birth-processes
- Some relations between mutual information and estimation error in Wiener space
- Estimation of low-rank matrices via approximate message passing
- An information-percolation bound for spin synchronization on general graphs
- Statistical thresholds for tensor PCA
- Parametric Regularity of the Conditional Expectations via the Malliavin Calculus and Applications
- Almost Perfect Privacy for Additive Gaussian Privacy Filters
- Mutual information for stochastic differential equations driven by fractional Brownian motion
- REMARKS ON A SEMICIRCULAR PERTURBATION OF THE FREE FISHER INFORMATION
- An extended orthogonal forward regression algorithm for system identification using entropy