Second-order converses via reverse hypercontractivity
DOI: 10.4171/MSL/13 · zbMath: 1460.62208 · arXiv: 1812.10129 · MaRDI QID: Q778887
Sergio Verdú, Jingbo Liu, Ramon van Handel
Publication date: 20 July 2020
Published in: Mathematical Statistics and Learning
Full work available at URL: https://arxiv.org/abs/1812.10129
Keywords: concentration of measure; blowing-up lemma; information-theoretic inequalities; reverse hypercontractivity; strong converse
MSC classifications: Inequalities; stochastic orderings (60E15); Markov semigroups and applications to diffusion processes (47D07); Coding and information theory (compaction, compression, models of communication, encoding schemes, etc.) (aspects in computer science) (68P30); Information theory (general) (94A15); Statistical aspects of information-theoretic topics (62B10); Coding theorems (Shannon theory) (94A24); Applications of operator theory in probability theory and statistics (47N30); Statistical aspects of big data and data science (62R07)
Related Items (2)
Cites Work
- A Mathematical Theory of Communication
- Strong converse of the coding theorem for semicontinuous channels
- Positivity improving operators and hypercontractivity
- Hypercontractivity of Hamilton-Jacobi equations.
- On reverse hypercontractivity
- Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality
- A User's Guide to Measure Theoretic Probability
- Concentration of Measure Inequalities in Information Theory, Communications, and Coding
- Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
- Nonasymptotic and Second-Order Achievability Bounds for Coding With Side-Information
- Network Information Theory
- Large-Sample Theory: Parametric Case
- Certain results in coding theory for noisy channels
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- A simple proof of the blowing-up lemma (Corresp.)
- A proof of Marton's coding theorem for the discrete memoryless broadcast channel (Corresp.)
- On source coding with side information at the decoder
- Asymptotic evaluation of certain Markov process expectations for large time. III
- Correction to bounds on conditional probabilities with applications
- Approximation theory of output statistics
- A general formula for channel capacity
- The empirical distribution of good codes
- A Proof of the Strong Converse Theorem for Gaussian Broadcast Channels via the Gaussian Poincaré Inequality
- Hypothesis testing with communication constraints
- Information Spectrum Approach to Second-Order Coding Rate in Channel Coding
- Smoothing Brascamp-Lieb Inequalities and Strong Converses of Coding Theorems
- Logarithmic Sobolev inequalities in discrete product spaces
- Fixed-Length Lossy Compression in the Finite Blocklength Regime
- Channel Coding Rate in the Finite Blocklength Regime
- Optimal Lossless Data Compression: Non-Asymptotics and Asymptotics
- Empirical Distribution of Good Channel Codes With Nonvanishing Error Probability
- Elements of Information Theory
- Notes on a general strong converse
- Lower bounds to error probability for coding on discrete memoryless channels. I
- Information Theory
- Approximate tensorization of entropy at high temperature