Common Information, Noise Stability, and Their Extensions
From MaRDI portal
Publication:5863763
Recommendations
- Convergence of information, random variables and noise
- The Lossy Common Information of Correlated Sources
- An Analogue of Shannon Information Theory for Detection and Stabilization via Noisy Discrete Communication Channels
- On Exact and ∞-Rényi Common Informations
- Mutual information in stationary channels with additive noise
- Universality and optimality in the information-disturbance tradeoff
- scientific article; zbMATH DE number 762962
- Some new results on information transmission over noisy channels
- Noisy information: optimality, complexity, tractability
- Pointwise Relations Between Information and Estimation in Gaussian Noise
Cites work
- scientific article; zbMATH DE number 3181534 (no title available)
- scientific article; zbMATH DE number 3468645 (no title available)
- scientific article; zbMATH DE number 3497786 (no title available)
- scientific article; zbMATH DE number 3582057 (no title available)
- scientific article; zbMATH DE number 665662 (no title available)
- scientific article; zbMATH DE number 1158743 (no title available)
- scientific article; zbMATH DE number 2069981 (no title available)
- scientific article; zbMATH DE number 1821199 (no title available)
- scientific article; zbMATH DE number 1416816 (no title available)
- scientific article; zbMATH DE number 3019401 (no title available)
- $\Phi$-Entropic Measures of Correlation
- A Lossy Source Coding Interpretation of Wyner’s Common Information
- A Mathematical Theory of Communication
- A Method for the Construction of Minimum-Redundancy Codes
- A Moment Ratio Bound for Polynomials and Some Extremal Properties of Krawchouk Polynomials and Hamming Spheres
- A Note on the Probability of Rectangles for Correlated Binary Strings
- A Variational Characterization of Rényi Divergences
- A conditional entropy bound for a pair of discrete random variables
- A general formula for channel capacity
- A polynomial bound in Freiman's theorem.
- A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)
- A theorem on the entropy of certain binary sequences and applications--I
- A two-sided estimate for the Gaussian noise stability deficit
- An almost optimal algorithm for computing nonnegative rank
- Analysis of Boolean Functions
- Analysis of Remaining Uncertainties and Exponents Under Various Conditional Rényi Entropies
- Appendix: On Common Information and Related Characteristics of Correlated Information Sources
- Approximation theory of output statistics
- Asymptotic Coupling and Its Applications in Information Theory
- Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
- Asymptotic Properties on Codeword Lengths of an Optimal FV Code for General Sources
- Best constants in Young's inequality, its converse, and its generalization to more than three functions
- Boolean Functions: Noise Stability, Non-Interactive Correlation Distillation, and Mutual Information
- Broadcast channels with confidential messages
- Channel Coding Rate in the Finite Blocklength Regime
- Channel Simulation via Interactive Communications
- Coding for computing
- Coin flipping from a cosmic source: On error correction of truly random bits
- Common Information and Secret Key Capacity
- Common randomness and secret key generation with a helper
- Communication for Generating Correlation: A Unifying Survey
- Concentration of measure inequalities in information theory, communications, and coding
- Corrections to “Wyner’s Common Information Under Rényi Divergence Measures” [May 2018, pp. 3616–3632]
- Dictator functions maximize mutual information
- Distributed Channel Synthesis
- Distributed Simulation of Continuous Random Variables
- Edge-isoperimetric inequalities and ball-noise stability: linear programming and probabilistic approaches
- Efficient Protocols for Generating Bipartite Classical Distributions and Quantum States
- Elements of Information Theory
- \(\Lambda(p)\) sets in the dual of \(D^\infty\)
- Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem
- Equivocations, Exponents, and Second-Order Coding Rates Under Various Rényi Information Measures
- Euclidean partitions optimizing noise stability
- Exact Channel Synthesis
- Exponential Decreasing Rate of Leaked Information in Universal Random Privacy Amplification
- Expressing combinatorial optimization problems by linear programs
- Extended Gray–Wyner System With Complementary Causal Side Information
- Closure in probability of certain subspaces of an \(L^2\) space
- Functional inequalities for Markov semigroups
- Fundamental Limits of Caching
- General formulas for capacity of classical-quantum channels
- General nonasymptotic and asymptotic formulas in channel resolvability and identification capacity and their application to the wiretap channel
- Generalized cutoff rates and Renyi's information measures
- Generating random bits from an arbitrary source: fundamental limits
- Geometric bounds on the Ornstein-Uhlenbeck velocity process
- Heuristics for exact nonnegative matrix factorization
- Improved log-Sobolev inequalities, hypercontractivity and uncertainty principle on the hypercube
- Information Spectrum Approach to Second-Order Coding Rate in Channel Coding
- Information Theoretic Security for Encryption Based on Conditional Rényi Entropies
- Information radius
- Information theoretic security
- Information theory. Coding theorems for discrete memoryless systems
- Information-Theoretic Caching: Sequential Coding for Computing
- Information-theoretic approximations of the nonnegative rank
- Information-theoretic key agreement: from weak to strong secret for free
- Large deviations, moderate deviations and LIL for empirical processes
- Learning the parts of objects by non-negative matrix factorization
- Logarithmic Sobolev Inequalities
- Low correlation noise stability of symmetric sets
- Lower bounds on nonnegative rank via nonnegative nuclear norms
- Lévy-Gromov's isoperimetric inequality for an infinite dimensional diffusion generator
- Mass transportation problems. Vol. 1: Theory. Vol. 2: Applications
- Maximally stable Gaussian partitions with discrete applications
- Moderate Deviations in Channel Coding
- Noise stability of functions with low influences: invariance and optimality
- Noiseless coding of correlated information sources
- Non-interactive correlation distillation, inhomogeneous Markov chains, and the reverse Bonami-Beckner inequality
- Nonnegative ranks, decompositions, and factorizations of nonnegative matrices
- On Exact and ∞-Rényi Common Informations
- On Extracting Common Random Bits From Correlated Sources
- On Non-Interactive Simulation of Binary Random Variables
- On Non-Interactive Simulation of Joint Distributions
- On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences
- On Sequences of Pairs of Dependent Random Variables
- On a reverse form of the Brascamp-Lieb inequality
- On logarithmic Sobolev inequalities. With a preface of Dominique Bakry and Michel Ledoux
- On measures of dependence
- On reverse hypercontractivity
- On the (im)possibility of non-interactive correlation distillation
- On the Entropy of a Noisy Function
- On the Rényi Divergence, Joint Range of Relative Entropies, and a Channel Coding Theorem
- On the complexity of nonnegative matrix factorization
- On the distance distribution of codes
- On the minimum average distance of binary codes: Linear programming approach
- On the nonnegative rank of distance matrices
- Optimal Assignments of Numbers to Vertices
- Overcoming weak expectations
- Physical-layer security. From information theory to security engineering
- Positive semidefinite rank
- Positivity improving operators and hypercontractivity
- Probability and stochastics.
- Quantum Achievability Proof via Collision Relative Entropy
- Real rank versus nonnegative rank
- Robust optimality of Gaussian noise stability
- Rényi Divergence and Kullback-Leibler Divergence
- Rényi Resolvability and Its Applications to the Wiretap Channel
- Rényi divergence and the central limit theorem
- Second-Order Asymptotics in Fixed-Length Source Coding and Intrinsic Randomness
- Simulation of Random Variables Under Rényi Divergence Measures of All Orders
- Simulation of a Channel with Another Channel
- Simulation of random processes and rate-distortion theory
- Source Coding for a Simple Network
- Spreading of sets in product spaces and hypercontraction of the Markov operator
- Strong Data Processing Inequalities and $\Phi$-Sobolev Inequalities for Discrete Channels
- Strong Functional Representation Lemma and Applications to Coding Theorems
- Strong Secrecy From Channel Resolvability
- Subadditivity of the entropy and its relation to Brascamp-Lieb type inequalities
- The Communication Complexity of Correlation
- The Lossy Common Information of Correlated Sources
- The Quantum Reverse Shannon Theorem and Resource Tradeoffs for Simulating Quantum Channels
- The Wire-Tap Channel
- The coding of messages subject to chance errors
- The common information of two dependent random variables
- The rate-distortion function for source coding with side information at the decoder
- Third-Order Asymptotics of Variable-Length Compression Allowing Errors
- Values and Bounds for the Common Information of Two Discrete Random Variables
- Variable-Length Compression Allowing Errors
- Weak variable-length source coding
- Which Boolean Functions Maximize Mutual Information on Noisy Inputs?
- Wyner’s Common Information Under Rényi Divergence Measures
- Zero-Error Channel Capacity and Simulation Assisted by Non-Local Correlations
- A study of the Fourier coefficients of functions in \(L^p(G)\)
Cited in (5)
- Conditional and relevant common information
- The Gray-Wyner Network and Wyner’s Common Information for Gaussian Sources
- The Lossy Common Information of Correlated Sources
- Inducing Information Stability to Obtain Information Theoretic Necessary Requirements
- Universality and optimality in the information-disturbance tradeoff
This page was built for publication: Common Information, Noise Stability, and Their Extensions (MaRDI item Q5863763)