Convergence properties of functional estimates for discrete distributions
Publication: 2772919
DOI: 10.1002/rsa.10019
zbMath: 0985.62006
OpenAlex: W2101985079
MaRDI QID: Q2772919
András Antos, Ioannis Kontoyiannis
Publication date: 23 May 2002
Published in: Random Structures and Algorithms
Full work available at URL: https://doi.org/10.1002/rsa.10019
Related Items (30)
- Identifying anomalous signals in GPS data using HMMs: an increased likelihood of earthquakes?
- Entropy, mutual information, and systematic measures of structured spiking neural networks
- A quantum-mechanical derivation of the multivariate central limit theorem for Markov chains
- Estimation of Entropy and Mutual Information
- Entropy Estimation in Turing's Perspective
- Mutual information in the frequency domain
- Bias reduction of the nearest neighbor entropy estimator
- Coincidences and estimation of entropies of random variables with large cardinalities
- A nonparametric two-sample test using a general φ-divergence-based mutual information
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
- Limit distributions and sensitivity analysis for empirical entropic optimal transport on countable spaces
- Optimal non-asymptotic concentration of centered empirical relative entropy in the high-dimensional regime
- Gaussian concentration bounds for stochastic chains of unbounded memory
- Assessing the dependence structure of the components of hybrid time series processes using mutual information
- Methods for diversity and overlap analysis in T-cell receptor populations
- A Bernstein-von Mises theorem for discrete probability distributions
- The resampling of entropies with the application of biodiversity
- A Note on Entropy Estimation
- Deviation inequalities for separately Lipschitz functionals of iterated random functions
- Causality and Bayesian network PDEs for multiscale representations of porous media
- Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions
- Learning and generalization with the information bottleneck
- Information in the Nonstationary Case
- On convergence properties of Shannon entropy
- Bias adjustment for a nonparametric entropy estimator
- Tsallis conditional mutual information in investigating long range correlation in symbol sequences
- Estimating entropy rate from censored symbolic time series: A test for time-irreversibility
- On entropy estimation for distributions with countable support
- Unnamed Item
- Unnamed Item
Cites Work
- An Efron-Stein inequality for nonsymmetric statistics
- Asymptotic recurrence and waiting times for stationary processes
- Another proof of a slow convergence result of Birgé
- On estimating a density using Hellinger distance and some other strange facts
- Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression
- Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
- On arbitrarily slow rates of global convergence in density estimation
- A universal algorithm for sequential data compression
- Compression of individual sequences via variable-rate coding
- Nonparametric entropy estimation for stationary processes and random fields, with applications to English text
- A sharp concentration inequality with applications
- Entropy estimation of symbol sequences
- Fifty years of Shannon theory
- Inequalities for the $r$th Absolute Moment of a Sum of Random Variables, $1 \leqq r \leqq 2$