A historical perspective on Schützenberger-Pinsker inequalities
From MaRDI portal
Publication:6178855
DOI: 10.1007/978-3-031-38271-0_29
MaRDI QID: Q6178855
Publication date: 16 January 2024
Published in: Lecture Notes in Computer Science
Keywords: total variation, mutual information, Kullback-Leibler divergence, Pinsker inequality, data processing inequality, statistical distance
MSC classifications: Central limit and other weak theorems (60F05); Measures of information, entropy (94A17); Information theory (general) (94A15); Statistical aspects of information-theoretic topics (62B10)
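For context on the keywords above: the Pinsker inequality (the subject of this publication) bounds the total variation distance between two probability measures $P$ and $Q$ by their Kullback-Leibler divergence. In its standard form, with the divergence measured in nats,

```latex
% Pinsker's inequality: total variation bounded via KL divergence.
% Here \|P - Q\|_{TV} = \sup_A |P(A) - Q(A)| and
% D(P \| Q) = \int \log\frac{dP}{dQ}\, dP (natural logarithm).
\[
  \|P - Q\|_{TV} \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)}.
\]
```

This is the classical statement; the refinements and historical variants (Schützenberger, Vajda, Csiszár $f$-divergence forms) surveyed in the publication sharpen or generalize this bound.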
Cites Work
- Speed of approach to equilibrium for Kac's caricature of a Maxwellian gas
- Information geometry and its applications
- Entropy and the central limit theorem
- On exponential bounds for binomial probabilities
- A note on exponential bounds for binomial probabilities
- Information Measures: The Curious Case of the Binary Alphabet
- [https://portal.mardi4nfdi.de/wiki/Publication:3048064 Estimation des densités: risque minimax]
- Some Limit Theorems for Random Functions. I
- Sharper lower bounds for discrimination information in terms of variation (Corresp.)
- Refinements of Pinsker's inequality
- On the Optimum Rate of Transmitting Information
- On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences
- Information, Divergence and Risk for Binary Experiments
- A Note on Hoeffding's Inequality
- Note on discrimination information and variation (Corresp.)
- On Information and Sufficiency
- Information Theory
- Introduction to nonparametric estimation