Operational Interpretation of Rényi Information Measures via Composite Hypothesis Testing Against Product and Markov Distributions
Publication:4566686
DOI: 10.1109/TIT.2017.2776900
zbMATH Open: 1390.94644
DBLP: journals/tit/TomamichelH18
arXiv: 1511.04874
Wikidata: Q60026368
Scholia: Q60026368
MaRDI QID: Q4566686
FDO: Q4566686
Authors: Marco Tomamichel, Masahito Hayashi
Publication date: 27 June 2018
Published in: IEEE Transactions on Information Theory
Abstract: We revisit the problem of asymmetric binary hypothesis testing against a composite alternative hypothesis. We introduce a general framework to treat such problems when the alternative hypothesis adheres to certain axioms. In this case we find the threshold rate, the optimal error and strong converse exponents (at large deviations from the threshold), and the second-order asymptotics (at small deviations from the threshold). We apply our results to find operational interpretations of various Rényi information measures. When the alternative hypothesis consists of bipartite product distributions, we find that the optimal error and strong converse exponents are determined by variations of Rényi mutual information. When the alternative hypothesis consists of tripartite distributions satisfying the Markov property, we find that the optimal exponents are determined by variations of Rényi conditional mutual information. In either case the relevant notion of Rényi mutual information depends on the precise choice of the alternative hypothesis. As such, our work also strengthens the view that different definitions of Rényi mutual information, conditional entropy, and conditional mutual information are appropriate depending on the context in which the measures are used.
Full work available at URL: https://arxiv.org/abs/1511.04874
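For orientation, a minimal sketch (in LaTeX) of the quantities the abstract refers to, assuming the standard classical definitions; the specific Rényi variants identified in the paper depend on the chosen alternative hypothesis and may differ from the one shown here.

% Rényi divergence of order alpha between discrete distributions P and Q (standard definition):
\[
  D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha-1}\,\log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad \alpha \in (0,1)\cup(1,\infty).
\]
% One common variant of Rényi mutual information (Sibson's), arising when the composite
% alternative consists of product distributions P_X \times Q_Y with Q_Y free:
\[
  I_\alpha(X;Y) \;=\; \min_{Q_Y}\, D_\alpha\!\bigl(P_{XY} \,\|\, P_X \times Q_Y\bigr).
\]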
Recommendations
- Correlation detection and an operational interpretation of the Rényi mutual information
- Rényi information, loglikelihood and an intrinsic distribution measure
- Significance testing of information theoretic functionals
- Generalized cutoff rates and Rényi's information measures
- Statistical inference for Rényi entropy functionals
- Information Inequalities for Joint Distributions, With Interpretations and Applications
- An operational characterization of mutual information in algorithmic information theory
Cited In (5)
- Bounds for smooth min- and max-entropy
- A fluctuation theory of communications
- Decomposition rules for quantum Rényi mutual information with an application to information exclusion relations
- Operational interpretation of the sandwiched Rényi divergence of order 1/2 to 1 as strong converse exponents
- On composite quantum hypothesis testing