Tight MMSE Bounds for the AGN Channel Under KL Divergence Constraints on the Input Distribution
From MaRDI portal
Publication: Q6300853
arXiv: 1804.10151
MaRDI QID: Q6300853
FDO: Q6300853
Authors: Michael Fauß, Alex Dytso, Abdelhak M. Zoubir, H. Vincent Poor
Publication date: 26 April 2018
Abstract: Tight bounds on the minimum mean square error (MMSE) for the additive Gaussian noise channel are derived when the input distribution is constrained to be epsilon-close to a Gaussian reference distribution in terms of the Kullback--Leibler divergence. The distributions that attain the bounds are shown to be Gaussian, with means identical to that of the reference distribution and with covariance matrices defined implicitly via systems of matrix equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound is shown to provide a potentially tighter alternative to the Cramér--Rao bound. Both properties are illustrated with numerical examples.
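As background to the quantity the paper bounds, the sketch below illustrates the MMSE for the scalar additive Gaussian noise channel Y = X + N when the input prior is exactly the Gaussian reference (the epsilon = 0 case). The closed form sx2 * sn2 / (sx2 + sn2) and the linear conditional-mean estimator are standard facts, not taken from this paper; the variance values are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar AGN channel Y = X + N with Gaussian input X ~ N(0, sx2)
# and independent noise N ~ N(0, sn2). For a Gaussian prior the
# conditional-mean estimator is linear, E[X|Y] = sx2/(sx2+sn2) * Y,
# and the MMSE has the closed form sx2*sn2/(sx2+sn2).
sx2, sn2 = 2.0, 1.0  # arbitrary example variances
mmse_closed = sx2 * sn2 / (sx2 + sn2)

# Monte Carlo check of the closed form.
n = 1_000_000
x = rng.normal(0.0, np.sqrt(sx2), n)
y = x + rng.normal(0.0, np.sqrt(sn2), n)
x_hat = (sx2 / (sx2 + sn2)) * y
mmse_mc = np.mean((x - x_hat) ** 2)

print(mmse_closed, mmse_mc)
```

The paper's contribution is to bound how far the MMSE can move from this Gaussian baseline when the true input distribution is only known to lie within a KL-divergence ball around the reference.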
Statistical aspects of information-theoretic topics (62B10) Bayesian inference (62F15) Nonparametric robustness (62G35) Parametric inference under constraints (62F30)