Pages that link to "Item:Q3546951"
From MaRDI portal
The following pages link to Mutual Information and Minimum Mean-Square Error in Gaussian Channels (Q3546951):
Displayed 37 items.
- A stochastic successive minimization method for nonsmooth nonconvex optimization with applications to transceiver design in wireless communication networks (Q301668)
- Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties (Q406088)
- An integral representation of the relative entropy (Q406130)
- Is mutual information adequate for feature selection in regression? (Q460666)
- Heat equation and convolution inequalities (Q475713)
- Quantifying information transmission in eukaryotic gradient sensing and chemotactic response (Q540569)
- Continuous trajectory planning of mobile sensors for informative forecasting (Q608428)
- Second-order converses via reverse hypercontractivity (Q778887)
- Informative windowed forecasting of continuous-time linear systems for mutual information-based sensor planning (Q895124)
- A concavity property for the reciprocal of Fisher information and its consequences on Costa's EPI (Q1618481)
- A Stein deficit for the logarithmic Sobolev inequality (Q1700699)
- Fundamental limits of symmetric low-rank matrix estimation (Q1740590)
- Strong replica symmetry in high-dimensional optimal Bayesian inference (Q2159230)
- Statistical limits of spiked tensor models (Q2179237)
- Multi-target robust waveform design based on harmonic variance and mutual information (Q2246554)
- The information-theoretic meaning of Gagliardo-Nirenberg type inequalities (Q2319583)
- On signalling and estimation limits for molecular birth-processes (Q2324920)
- Some relations between mutual information and estimation error in Wiener space (Q2456053)
- Estimation of low-rank matrices via approximate message passing (Q2656598)
- An information-percolation bound for spin synchronization on general graphs (Q2657904)
- Statistical thresholds for tensor PCA (Q2657928)
- Parametric Regularity of the Conditional Expectations via the Malliavin Calculus and Applications (Q2841801)
- Almost Perfect Privacy for Additive Gaussian Privacy Filters (Q2953821)
- Mutual information for stochastic differential equations driven by fractional Brownian motion (Q3077709)
- Remarks on a semicircular perturbation of the free Fisher information (Q3520353)
- An extended orthogonal forward regression algorithm for system identification using entropy (Q3543002)
- Community Detection and Stochastic Block Models (Q4558502)
- Asymptotic mutual information for the balanced binary stochastic block model (Q4603702)
- Entropy flow and de Bruijn's identity for a class of stochastic differential equations driven by fractional Brownian motion (Q5051912)
- The adaptive interpolation method for proving replica formulas. Applications to the Curie–Weiss and Wigner spike models (Q5055675)
- Information theoretic limits of learning a sparse rule (Q5078669)
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising (Q5101092)
- Information-Theoretic Bounds and Approximations in Neural Population Coding (Q5157152)
- A de Bruijn's identity for dependent random variables based on copula theory (Q5358071)
- Relations Between Information and Estimation in the Presence of Feedback (Q5497044)
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem (Q6061127)
- Event-triggered remote state estimation over a collision channel with incomplete information (Q6115799)