Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
Publication: 6061127
DOI: 10.1007/978-3-031-12244-6_26
zbMath: 1530.94011
arXiv: 2204.05033
OpenAlex: W4376624861
MaRDI QID: Q6061127
Authors: Ioannis Kontoyiannis, Lampros Gavalakis
Publication date: 3 December 2023
Published in: Lecture Notes in Mathematics
Full work available at URL: https://arxiv.org/abs/2204.05033
MSC classifications:
- Measures of information, entropy (94A17)
- Information theory (general) (94A15)
- Statistical aspects of fuzziness, sufficiency, and information (62B86)
Cites Work
- A Mathematical Theory of Communication
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Compound Poisson approximation via information functionals
- Sanov property, generalized I-projection and a conditional limit theorem
- Entropy production by block variable summation and central limit theorems
- A simple proof of Sanov's theorem
- Log-concavity and the maximum entropy property of the Poisson distribution
- Convergence of Markov chains in information divergence
- Subadditivity of the entropy and its relation to Brascamp-Lieb type inequalities
- Ultracontractivity and the heat kernel for Schrödinger operators and Dirichlet Laplacians
- Entropy and the central limit theorem
- Large deviation theorems for empirical probability measures
- Finite exchangeable sequences
- I-divergence geometry of probability distributions and minimization problems
- Finite forms of de Finetti's theorem on exchangeability
- On modified logarithmic Sobolev inequalities for Bernoulli and Poisson measures
- Exponential integrability and transportation cost related to logarithmic Sobolev inequalities
- The analogues of entropy and of Fisher's information measure in free probability theory. I
- The analogues of entropy and of Fisher's information measure in free probability theory. II
- Entropy and convergence on compact groups
- The interplay of Bayesian and frequentist analysis
- Entropy inequalities and the central limit theorem.
- Fisher information inequalities and the central limit theorem
- The analogues of entropy and of Fisher's information measure in free probability theory. III: The absence of Cartan subalgebras
- An information-theoretic proof of a finite de Finetti theorem
- Information theory and the limit-theorem for Markov chains and processes with a countable infinity of states
- Bernoulli shifts with the same entropy are isomorphic
- Two Bernoulli shifts with infinite entropy are isomorphic
- Bounding \(\bar d\)-distance by informational divergence: A method to prove measure concentration
- Concentration of Measure Inequalities in Information Theory, Communications, and Coding
- Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
- On Talagrand's deviation inequalities for product measures
- On the similarity of the entropy power inequality and the Brunn-Minkowski inequality (Corresp.)
- Symmetric Measures on Cartesian Products
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- On a property of normal distributions of any stochastic process
- An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- Entropy and the Law of Small Numbers
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Conditional limit theorems under Markov conditioning
- Information theoretic inequalities
- Large deviation theorems for empirical types of Markov chains constrained to thin sets
- Logarithmic Sobolev Inequalities
- Sul significato soggettivo della probabilità [On the subjective meaning of probability]
- Binomial and Poisson distributions as maximum entropy distributions
- Entropy Bounds on Abelian Groups and the Ruzsa Divergence
- Entropic CLT and Phase Transition in High-dimensional Wishart Matrices
- The method of types [information theory]
- On the Optimum Rate of Transmitting Information
- Solution of Shannon’s problem on the monotonicity of entropy
- Sumset and Inverse Sumset Theory for Shannon Entropy
- On the Entropy of Compound Distributions on Nonnegative Integers
- Thinning, Entropy, and the Law of Thin Numbers
- The convolution inequality for entropy powers
- Elements of Information Theory
- Optimal Transport