Robust Hypothesis Testing With a Relative Entropy Tolerance
Publication: 4975705
DOI: 10.1109/TIT.2008.2008128
zbMATH Open: 1367.62018
arXiv: 0707.2926
OpenAlex: W3099452348
MaRDI QID: Q4975705
FDO: Q4975705
Authors: Bernard C. Levy
Publication date: 8 August 2017
Published in: IEEE Transactions on Information Theory
Abstract: This paper considers the design of a minimax test for two hypotheses where the actual probability densities of the observations are located in neighborhoods obtained by placing a bound on the relative entropy between actual and nominal densities. The minimax problem admits a saddle point which is characterized. The robust test applies a nonlinear transformation which flattens the nominal likelihood ratio in the vicinity of one. Results are illustrated by considering the transmission of binary data in the presence of additive noise.
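The abstract describes a robust test that applies a nonlinear transformation flattening the nominal likelihood ratio in the vicinity of one. As a purely illustrative sketch (not the paper's exact saddle-point construction, whose flattening thresholds are determined by the relative-entropy tolerance), the qualitative behavior can be mimicked with a nominal Gaussian likelihood ratio and a hypothetical flattening band `[lo, hi]`:

```python
import math

def nominal_lr(y, mu0=-1.0, mu1=1.0, sigma=1.0):
    # Nominal likelihood ratio p1(y)/p0(y) for two Gaussian hypotheses
    # with means mu0, mu1 and common variance sigma^2 (illustrative
    # choice, e.g. binary antipodal signaling in additive Gaussian noise).
    return math.exp((mu1 - mu0) * (y - (mu0 + mu1) / 2.0) / sigma**2)

def flattened_lr(l, lo=0.5, hi=2.0):
    # Hypothetical flattening transform: nominal LR values inside the
    # band [lo, hi] around one are mapped to 1, while values outside the
    # band are rescaled so the transform stays continuous and monotone.
    # The band endpoints here are arbitrary; in the paper they would be
    # fixed by the relative-entropy tolerance of the neighborhoods.
    if lo <= l <= hi:
        return 1.0
    return l / lo if l < lo else l / hi

# The robust decision compares the flattened statistic to a threshold:
def robust_decide(y, threshold=1.0):
    return flattened_lr(nominal_lr(y)) >= threshold
```

Observations with a nominal likelihood ratio close to one carry no weight after flattening, which is what makes the test insensitive to small relative-entropy perturbations of the nominal densities.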
Full work available at URL: https://arxiv.org/abs/0707.2926
MSC classification:
Statistical aspects of information-theoretic topics (62B10)
Nonparametric hypothesis testing (62G10)
Nonparametric robustness (62G35)
Measures of information, entropy (94A17)
Cited In (4)
This page was built for publication: Robust Hypothesis Testing With a Relative Entropy Tolerance