Concentration of the information in data with log-concave distributions
MaRDI QID: Q717889
DOI: 10.1214/10-AOP592
zbMath: 1227.60043
arXiv: 1012.5457
OpenAlex: W1991757114
Authors: Mokshay Madiman, Sergey G. Bobkov
Publication date: 10 October 2011
Published in: The Annals of Probability
Full work available at URL: https://arxiv.org/abs/1012.5457
Keywords: entropy; concentration; log-concave distributions; Shannon-McMillan-Breiman theorem; asymptotic equipartition property
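As a brief orientation for these keywords (a sketch using standard information-theoretic definitions, not text taken from the paper): for a random vector X in R^n with density f, the information content is the random variable -log f(X), whose mean is the differential entropy, and the concentration in question is, loosely, an asymptotic equipartition property for log-concave densities without any product or ergodic structure. In LaTeX notation:

% Standard definitions only; the paper's precise bounds are not reproduced here.
Let $X$ be a random vector in $\mathbb{R}^n$ with density $f$. Its information
content and differential entropy are
\[
  \tilde h(X) = -\log f(X),
  \qquad
  h(X) = \mathbb{E}\,\tilde h(X) = -\int_{\mathbb{R}^n} f(x)\log f(x)\,dx .
\]
The asymptotic equipartition property says that $\tfrac{1}{n}\tilde h(X)$ is
close to $\tfrac{1}{n}h(X)$ with high probability; for log-concave $f$ the
fluctuations of $\tilde h(X)$ about $h(X)$ are (loosely stated) of order at
most $\sqrt{n}$, which is the concentration phenomenon named in the title.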
Related Items (29)
- SOME MEASURES INFORMATION FOR GENERALIZED AND q-GENERALIZED EXTREME VALUES AND ITS PROPERTIES
- PRESERVATION OF LOG-CONCAVITY UNDER CONVOLUTION
- Hyperbolic measures on infinite dimensional spaces
- PRESERVATION OF LOG-CONCAVITY AND LOG-CONVEXITY UNDER OPERATORS
- ANALYSIS AND APPLICATIONS OF THE RESIDUAL VARENTROPY OF RANDOM LIFETIMES
- A note on transportation cost inequalities for diffusions with reflections
- On the log-concavity of a Jacobi theta function
- A multivariate Gnedenko law of large numbers
- Varentropy of order statistics and some stochastic comparisons
- Concentration of information content for convex measures
- Optimal Concentration of Information Content for Log-Concave Densities
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- A discrete complement of Lyapunov's inequality and its information theoretic consequences
- Varentropy estimators with applications in testing uniformity
- Geometric and functional inequalities for log-concave probability sequences
- Dimensional behaviour of entropy and information
- Concentration of the Intrinsic Volumes of a Convex Body
- A NEW GENERALIZED VARENTROPY AND ITS PROPERTIES
- Dimensional variance inequalities of Brascamp-Lieb type and a local approach to dimensional Prékopa's theorem
- Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach Part I: Methodology and Experiments
- Log-concavity and strong log-concavity: a review
- A Distributed Framework for the Construction of Transport Maps
- Regularization under diffusion and anticoncentration of the information content
- On the exponentially weighted aggregate with the Laplace prior
- A Combinatorial Approach to Small Ball Inequalities for Sums and Differences
- Talagrand concentration inequalities for stochastic partial differential equations
- One-dimensional empirical measures, order statistics, and Kantorovich transport distances
- Varentropy of past lifetimes
- Maximum-a-Posteriori Estimation with Bayesian Confidence Regions
Cites Work
- A Mathematical Theory of Communication
- Generalizations of Shannon-McMillan theorem
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- A sandwich proof of the Shannon-McMillan-Breiman theorem
- Convex measures on locally convex spaces
- A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem
- Isoperimetric problems for convex bodies and a localization lemma
- Extremal properties of half-spaces for log-concave distributions
- Moment inequalities of Polya frequency functions
- Complements of Lyapunov's inequality
- Geometry of log-concave functions and measures
- Correction Notes: Correction to "The Individual Ergodic Theorem of Information Theory"
- Gaussian feedback capacity
- Random walks in a convex body and an improved volume algorithm
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- Properties of Probability Distributions with Monotone Hazard Rate
- The Basic Theorems of Information Theory