Common Information, Noise Stability, and Their Extensions
DOI: 10.1561/0100000122 · zbMATH Open: 1490.94042 · OpenAlex: W4225340749 · MaRDI QID: Q5863763 · FDO: Q5863763
Authors: Lei Yu, Vincent Y. F. Tan
Publication date: 3 June 2022
Published in: Foundations and Trends™ in Communications and Information Theory
Full work available at URL: https://doi.org/10.1561/0100000122
Recommendations
- Convergence of information, random variables and noise
- The Lossy Common Information of Correlated Sources
- An Analogue of Shannon Information Theory for Detection and Stabilization via Noisy Discrete Communication Channels
- On Exact and ∞-Rényi Common Informations
- Mutual information in stationary channels with additive noise
- Universality and optimality in the information-disturbance tradeoff
- Some new results on information transmission over noisy channels
- Noisy information: optimality, complexity, tractability
- Pointwise Relations Between Information and Estimation in Gaussian Noise
Keywords: information theory; communication complexity; Shannon theory; multiuser information theory; information theory and statistics
MSC: Communication theory (94A05); Information theory (general) (94A15); Sampling theory in information and communication theory (94A20)
Cites Work
- On Extracting Common Random Bits From Correlated Sources
- Dictator functions maximize mutual information
- Which Boolean Functions Maximize Mutual Information on Noisy Inputs?
- General nonasymptotic and asymptotic formulas in channel resolvability and identification capacity and their application to the wiretap channel
- Coin flipping from a cosmic source: On error correction of truly random bits
- Simulation of random processes and rate-distortion theory
- On Non-Interactive Simulation of Joint Distributions
- On Non-Interactive Simulation of Binary Random Variables
- A Moment Ratio Bound for Polynomials and Some Extremal Properties of Krawchouk Polynomials and Hamming Spheres
- Strong Secrecy From Channel Resolvability
- Physical-layer security. From information theory to security engineering
- A Variational Characterization of Rényi Divergences
- Communication for Generating Correlation: A Unifying Survey
- $\Phi$-Entropic Measures of Correlation
- Distributed Simulation of Continuous Random Variables
- On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem
- Improved log-Sobolev inequalities, hypercontractivity and uncertainty principle on the hypercube
- Information Theoretic Security for Encryption Based on Conditional Rényi Entropies
- Asymptotic Coupling and Its Applications in Information Theory
- Information-Theoretic Caching: Sequential Coding for Computing
- A Lossy Source Coding Interpretation of Wyner’s Common Information
- Variable-Length Compression Allowing Errors
- Channel Simulation via Interactive Communications
- The Lossy Common Information of Correlated Sources
- Fundamental Limits of Caching
- Equivocations, Exponents, and Second-Order Coding Rates Under Various Rényi Information Measures
- Boolean Functions: Noise Stability, Non-Interactive Correlation Distillation, and Mutual Information
- Common Information and Secret Key Capacity
- Asymptotic Properties on Codeword Lengths of an Optimal FV Code for General Sources
- A Note on the Probability of Rectangles for Correlated Binary Strings
- Generating random bits from an arbitrary source: fundamental limits
- Weak variable-length source coding
- Strong Functional Representation Lemma and Applications to Coding Theorems
- Rényi Resolvability and Its Applications to the Wiretap Channel
- Extended Gray–Wyner System With Complementary Causal Side Information
- Simulation of Random Variables Under Rényi Divergence Measures of All Orders
- Wyner’s Common Information Under Rényi Divergence Measures
- Exact Channel Synthesis
- On Exact and ∞-Rényi Common Informations
- Distributed Channel Synthesis
- Simulation of a Channel with Another Channel
- Analysis of Remaining Uncertainties and Exponents Under Various Conditional Rényi Entropies
- Elements of Information Theory
- A Mathematical Theory of Communication
- On measures of dependence
- On the complexity of nonnegative matrix factorization
- Mass transportation problems. Vol. 1: Theory. Vol. 2: Applications
- Learning the parts of objects by non-negative matrix factorization
- Probability and stochastics.
- Functional inequalities for Markov semigroups
- Real rank versus nonnegative rank
- Expressing combinatorial optimization problems by linear programs
- Logarithmic Sobolev Inequalities
- Approximation theory of output statistics
- Analysis of Boolean Functions
- Exponential Decreasing Rate of Leaked Information in Universal Random Privacy Amplification
- Optimal Assignments of Numbers to Vertices
- Nonnegative ranks, decompositions, and factorizations of nonnegative matrices
- On logarithmic Sobolev inequalities. With a preface of Dominique Bakry and Michel Ledoux
- On the distance distribution of codes
- On a reverse form of the Brascamp-Lieb inequality
- Noise stability of functions with low influences: invariance and optimality
- Heuristics for exact nonnegative matrix factorization
- On the nonnegative rank of distance matrices
- Generalized cutoff rates and Renyi's information measures
- A Method for the Construction of Minimum-Redundancy Codes
- Lévy-Gromov's isoperimetric inequality for an infinite dimensional diffusion generator
- Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
- Rényi Divergence and Kullback-Leibler Divergence
- General formulas for capacity of classical-quantum channels
- Second-Order Asymptotics in Fixed-Length Source Coding and Intrinsic Randomness
- The rate-distortion function for source coding with side information at the decoder
- Broadcast channels with confidential messages
- Information Spectrum Approach to Second-Order Coding Rate in Channel Coding
- Channel Coding Rate in the Finite Blocklength Regime
- Information theory. Coding theorems for discrete memoryless systems
- Étude des coefficients de Fourier des fonctions de \(L^p(G)\)
- The common information of two dependent random variables
- Lower bounds on nonnegative rank via nonnegative nuclear norms
- Positive semidefinite rank
- Maximally stable Gaussian partitions with discrete applications
- Robust optimality of Gaussian noise stability
- Low correlation noise stability of symmetric sets
- Geometric bounds on the Ornstein-Uhlenbeck velocity process
- Overcoming weak expectations
- A two-sided estimate for the Gaussian noise stability deficit
- Euclidean partitions optimizing noise stability
- Quantum Achievability Proof via Collision Relative Entropy
- The Quantum Reverse Shannon Theorem and Resource Tradeoffs for Simulating Quantum Channels
- A general formula for channel capacity
- Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem
- Positivity improving operators and hypercontractivity
- Information-theoretic approximations of the nonnegative rank
- Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality
- Values and Bounds for the Common Information of Two Discrete Random Variables
- The Communication Complexity of Correlation
- Efficient Protocols for Generating Bipartite Classical Distributions and Quantum States
- Noiseless coding of correlated information sources
- Best constants in Young's inequality, its converse, and its generalization to more than three functions
- A polynomial bound in Freiman's theorem.
- Information-theoretic key agreement: from weak to strong secret for free
- The Wire-Tap Channel
- Subadditivity of the entropy and its relation to Brascamp-Lieb type inequalities
- Coding for computing
- On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences
- On the (im)possibility of non-interactive correlation distillation
- Zero-Error Channel Capacity and Simulation Assisted by Non-Local Correlations
- Spreading of sets in product spaces and hypercontraction of the Markov operator
- On reverse hypercontractivity
- On Sequences of Pairs of Dependent Random Variables
- Concentration of measure inequalities in information theory, communications, and coding
- A theorem on the entropy of certain binary sequences and applications--I
- Large deviations, moderate deviations and LIL for empirical processes
- Rényi divergence and the central limit theorem
- Corrections to “Wyner’s Common Information Under Rényi Divergence Measures” [May 2018, 3616–3632]
- Appendix: On Common Information and Related Characteristics of Correlated Information Sources
- Common randomness and secret key generation with a helper
- Third-Order Asymptotics of Variable-Length Compression Allowing Errors
- Information radius
- An almost optimal algorithm for computing nonnegative rank
- The coding of messages subject to chance errors
- On the Entropy of a Noisy Function
- Strong Data Processing Inequalities and $\Phi$-Sobolev Inequalities for Discrete Channels
- On the minimum average distance of binary codes: Linear programming approach
- Information theoretic security
- Fermeture en probabilité de certains sous-espaces d'un espace \(L^2\)
- A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)
- Ensembles \(\Lambda(p)\) dans le dual de \(D^\infty\)
- Source Coding for a Simple Network
- A conditional entropy bound for a pair of discrete random variables
- Edge-isoperimetric inequalities and ball-noise stability: linear programming and probabilistic approaches
- Moderate Deviations in Channel Coding
Cited In (5)
- Conditional and relevant common information
- The Gray-Wyner Network and Wyner’s Common Information for Gaussian Sources
- The Lossy Common Information of Correlated Sources
- Inducing Information Stability to Obtain Information Theoretic Necessary Requirements
- Universality and optimality in the information-disturbance tradeoff