Common Information, Noise Stability, and Their Extensions
Publication: 5863763
DOI: 10.1561/0100000122
zbMath: 1490.94042
OpenAlex: W4225340749
MaRDI QID: Q5863763
Publication date: 3 June 2022
Published in: Foundations and Trends® in Communications and Information Theory
Full work available at URL: https://doi.org/10.1561/0100000122
Keywords: information theory, communication complexity, Shannon theory, multiuser information theory, information theory and statistics
MSC classifications: Information theory (general) (94A15); Sampling theory in information and communication theory (94A20); Communication theory (94A05)
Cites Work
- A Mathematical Theory of Communication
- Heuristics for exact nonnegative matrix factorization
- On the nonnegative rank of distance matrices
- A two-sided estimate for the Gaussian noise stability deficit
- Nonnegative ranks, decompositions, and factorizations of nonnegative matrices
- Euclidean partitions optimizing noise stability
- Lower bounds on nonnegative rank via nonnegative nuclear norms
- Positive semidefinite rank
- The coding of messages subject to chance errors
- Noise stability of functions with low influences: invariance and optimality
- Subadditivity of the entropy and its relation to Brascamp-Lieb type inequalities
- Real rank versus nonnegative rank
- On a reverse form of the Brascamp-Lieb inequality
- Positivity improving operators and hypercontractivity
- Expressing combinatorial optimization problems by linear programs
- Best constants in Young's inequality, its converse, and its generalization to more than three functions
- Spreading of sets in product spaces and hypercontraction of the Markov operator
- Large deviations, moderate deviations and LIL for empirical processes
- Mass transportation problems. Vol. 1: Theory. Vol. 2: Applications
- Dictator functions maximize mutual information
- Rényi divergence and the central limit theorem
- Maximally stable Gaussian partitions with discrete applications
- A polynomial bound in Freiman's theorem
- Lévy-Gromov's isoperimetric inequality for an infinite dimensional diffusion generator
- Information-theoretic approximations of the nonnegative rank
- Robust optimality of Gaussian noise stability
- Edge-isoperimetric inequalities and ball-noise stability: linear programming and probabilistic approaches
- Improved log-Sobolev inequalities, hypercontractivity and uncertainty principle on the hypercube
- On reverse hypercontractivity
- On the (im)possibility of non-interactive correlation distillation
- Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality
- \(\Lambda(p)\) sets in the dual of \(D^\infty\)
- A study of the Fourier coefficients of functions in \(L^p(G)\)
- Low correlation noise stability of symmetric sets
- Concentration of Measure Inequalities in Information Theory, Communications, and Coding
- Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
- Information Theoretic Security for Encryption Based on Conditional Rényi Entropies
- Information-Theoretic Caching: Sequential Coding for Computing
- On the Entropy of a Noisy Function
- Strong Data Processing Inequalities and \(\Phi\)-Sobolev Inequalities for Discrete Channels
- On Non-Interactive Simulation of Joint Distributions
- A Lossy Source Coding Interpretation of Wyner’s Common Information
- On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem
- Variable-Length Compression Allowing Errors
- Channel Simulation via Interactive Communications
- Quantum Achievability Proof via Collision Relative Entropy
- Moderate Deviations in Channel Coding
- Which Boolean Functions Maximize Mutual Information on Noisy Inputs?
- Rényi Divergence and Kullback-Leibler Divergence
- The Lossy Common Information of Correlated Sources
- Fundamental Limits of Caching
- The Quantum Reverse Shannon Theorem and Resource Tradeoffs for Simulating Quantum Channels
- Equivocations, Exponents, and Second-Order Coding Rates Under Various Rényi Information Measures
- Probability and Stochastics
- On Sequences of Pairs of Dependent Random Variables
- On measures of dependence
- Geometric bounds on the Ornstein-Uhlenbeck velocity process
- General nonasymptotic and asymptotic formulas in channel resolvability and identification capacity and their application to the wiretap channel
- General formulas for capacity of classical-quantum channels
- Asymptotic Properties on Codeword Lengths of an Optimal FV Code for General Sources
- On the Complexity of Nonnegative Matrix Factorization
- Information Theoretic Security
- Second-Order Asymptotics in Fixed-Length Source Coding and Intrinsic Randomness
- Source Coding for a Simple Network
- A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)
- The common information of two dependent random variables
- The Wire-Tap Channel
- Logarithmic Sobolev Inequalities
- The rate-distortion function for source coding with side information at the decoder
- A conditional entropy bound for a pair of discrete random variables
- Values and Bounds for the Common Information of Two Discrete Random Variables
- Broadcast channels with confidential messages
- Approximation theory of output statistics
- A general formula for channel capacity
- Weak variable-length source coding
- Common randomness and secret key generation with a helper
- Coding for computing
- Strong Functional Representation Lemma and Applications to Coding Theorems
- A Variational Characterization of Rényi Divergences
- Distributed Simulation of Continuous Random Variables
- \(\Phi\)-Entropic Measures of Correlation
- Asymptotic Coupling and Its Applications in Information Theory
- Rényi Resolvability and Its Applications to the Wiretap Channel
- Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem
- Extended Gray–Wyner System With Complementary Causal Side Information
- A theorem on the entropy of certain binary sequences and applications–I
- Generalized cutoff rates and Renyi's information measures
- Generating random bits from an arbitrary source: fundamental limits
- On the distance distribution of codes
- Simulation of random processes and rate-distortion theory
- An Optimal Lower Bound on the Communication Complexity of Gap-Hamming-Distance
- Overcoming Weak Expectations
- Boolean Functions: Noise Stability, Non-Interactive Correlation Distillation, and Mutual Information
- Information Spectrum Approach to Second-Order Coding Rate in Channel Coding
- The Communication Complexity of Correlation
- On Non-Interactive Simulation of Binary Random Variables
- A Moment Ratio Bound for Polynomials and Some Extremal Properties of Krawchouk Polynomials and Hamming Spheres
- Third-Order Asymptotics of Variable-Length Compression Allowing Errors
- Corrections to “Wyner’s Common Information Under Rényi Divergence Measures” [May 2018, pp. 3616–3632]
- Exact Channel Synthesis
- On Exact and ∞-Rényi Common Informations
- A Note on the Probability of Rectangles for Correlated Binary Strings
- Analysis of Boolean Functions
- Communication for Generating Correlation: A Unifying Survey
- Simulation of Random Variables Under Rényi Divergence Measures of All Orders
- On Extracting Common Random Bits From Correlated Sources
- Zero-Error Channel Capacity and Simulation Assisted by Non-Local Correlations
- Exponential Decreasing Rate of Leaked Information in Universal Random Privacy Amplification
- On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences
- Channel Coding Rate in the Finite Blocklength Regime
- Coin flipping from a cosmic source: On error correction of truly random bits
- Strong Secrecy From Channel Resolvability
- Distributed Channel Synthesis
- Common Information and Secret Key Capacity
- Efficient Protocols for Generating Bipartite Classical Distributions and Quantum States
- Simulation of a Channel with Another Channel
- Learning the parts of objects by non-negative matrix factorization
- Wyner’s Common Information Under Rényi Divergence Measures
- Analysis of Remaining Uncertainties and Exponents Under Various Conditional Rényi Entropies
- Elements of Information Theory
- Appendix: On Common Information and Related Characteristics of Correlated Information Sources
- A Method for the Construction of Minimum-Redundancy Codes
- Closure in probability of certain subspaces of an \(L^2\) space (https://portal.mardi4nfdi.de/wiki/Publication:5580728)
- Information radius
- Optimal Assignments of Numbers to Vertices
- Noiseless coding of correlated information sources
- An Almost Optimal Algorithm for Computing Nonnegative Rank
- Information Theory
- On the minimum average distance of binary codes: Linear programming approach