Concentration of the information in data with log-concave distributions
From MaRDI portal
DOI: 10.1214/10-AOP592
zbMATH Open: 1227.60043
arXiv: 1012.5457
OpenAlex: W1991757114
MaRDI QID: Q717889
Authors: Mokshay Madiman, S. G. Bobkov
Publication date: 10 October 2011
Published in: The Annals of Probability
Abstract: A concentration property of the functional −log f(X) is demonstrated when a random vector X has a log-concave density f on ℝⁿ. This concentration property implies in particular an extension of the Shannon-McMillan-Breiman strong ergodic theorem to the class of discrete-time stochastic processes with log-concave marginals.
Full work available at URL: https://arxiv.org/abs/1012.5457
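The concentration described in the abstract can be illustrated numerically. The sketch below is a minimal assumption-laden example (not from the paper): it takes the standard Gaussian on ℝⁿ as a concrete log-concave density, computes the per-coordinate information content −log f(X)/n for many samples, and shows that it clusters around the differential entropy per coordinate with fluctuations of order 1/√n.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_content(x):
    # -log f(x) for the standard Gaussian density f on R^n:
    # -log f(x) = (n/2) log(2*pi) + |x|^2 / 2
    n = x.shape[-1]
    return 0.5 * n * np.log(2 * np.pi) + 0.5 * np.sum(x**2, axis=-1)

n, samples = 1000, 5000
x = rng.standard_normal((samples, n))

# Per-coordinate information content of each sample.
h_tilde = info_content(x) / n

# Differential entropy per coordinate of the standard Gaussian.
entropy_per_coord = 0.5 * np.log(2 * np.pi * np.e)

# Concentration: the empirical mean sits near the entropy, and the
# standard deviation is of order 1/sqrt(2n) (about 0.022 for n = 1000).
print(np.mean(h_tilde), entropy_per_coord, np.std(h_tilde))
```

For this Gaussian case the variance of −log f(X) is exactly n/2, so the per-coordinate fluctuation 1/√(2n) vanishes as the dimension grows, which is the dimension-free concentration phenomenon the paper establishes for the whole log-concave class.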
Recommendations
- Optimal concentration of information content for log-concave densities
- Concentration of information content for convex measures
- Information inequalities and concentration of measure
- Concentration functions and entropy bounds for discrete log-concave distributions
- Dimension-free information concentration via exp-concavity
- Concentration inequalities for ultra log-concave distributions
- Intrinsic Entropies of Log-Concave Distributions
- Entropy and information jump for log-concave vectors
Keywords: entropy; concentration; log-concave distributions; Shannon-McMillan-Breiman theorem; asymptotic equipartition property
Cites Work
- A Mathematical Theory of Communication
- A sandwich proof of the Shannon-McMillan-Breiman theorem
- Convex measures on locally convex spaces
- Isoperimetric problems for convex bodies and a localization lemma
- Extremal properties of half-spaces for log-concave distributions
- Random walks in a convex body and an improved volume algorithm
- Title not available
- Properties of Probability Distributions with Monotone Hazard Rate
- The Basic Theorems of Information Theory
- Title not available
- The strong ergodic theorem for densities: Generalized Shannon-McMillan- Breiman theorem
- A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem
- Geometry of log-concave functions and measures
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- Moment inequalities of Polya frequency functions
- Complements of Lyapunov's inequality
- Correction Notes: Correction to "The Individual Ergodic Theorem of Information Theory"
- Gaussian feedback capacity
- Title not available
- Generalizations of Shannon-McMillan theorem
Cited In (40)
- Concentration functions and entropy bounds for discrete log-concave distributions
- Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures
- A discrete complement of Lyapunov's inequality and its information theoretic consequences
- Chernoff's density is log-concave
- Varentropy of order statistics and some stochastic comparisons
- Dimensional behaviour of entropy and information
- Varentropy estimators with applications in testing uniformity
- Geometric and functional inequalities for log-concave probability sequences
- On stochastic properties of past varentropy with applications
- Maximum-a-posteriori estimation with Bayesian confidence regions
- Title not available
- A NEW GENERALIZED VARENTROPY AND ITS PROPERTIES
- SOME MEASURES OF INFORMATION FOR GENERALIZED AND q-GENERALIZED EXTREME VALUES AND ITS PROPERTIES
- Hyperbolic measures on infinite dimensional spaces
- Talagrand concentration inequalities for stochastic partial differential equations
- Regularization under diffusion and anticoncentration of the information content
- On the \(s\)-Gaussian measure in \(\mathbb{R}^n\)
- Aging notions, stochastic orders, and expected utilities
- Optimal concentration of information content for log-concave densities
- A note on transportation cost inequalities for diffusions with reflections
- Dimension-free information concentration via exp-concavity
- On the log-concavity of a Jacobi theta function
- ANALYSIS AND APPLICATIONS OF THE RESIDUAL VARENTROPY OF RANDOM LIFETIMES
- Concentration of information content for convex measures
- Stochastic properties of varentropy based on residual lifetime
- PRESERVATION OF LOG-CONCAVITY AND LOG-CONVEXITY UNDER OPERATORS
- A combinatorial approach to small ball inequalities for sums and differences
- Maximum likelihood estimation of regularization parameters in high-dimensional inverse problems: an empirical Bayesian approach. I: Methodology and experiments
- Log-concavity and strong log-concavity: a review
- On the exponentially weighted aggregate with the Laplace prior
- Varentropy estimators applied to test of fit for inverse Gaussian distribution
- Varentropy estimators applied to goodness of fit tests for the Gumbel distribution
- Concentration of the intrinsic volumes of a convex body
- Dimensional variance inequalities of Brascamp-Lieb type and a local approach to dimensional Prékopa's theorem
- Varentropy of past lifetimes
- One-dimensional empirical measures, order statistics, and Kantorovich transport distances
- A multivariate Gnedenko law of large numbers
- A distributed framework for the construction of transport maps
- An ergodic theorem for constrained sequences of functions
- Preservation of log-concavity under convolution