Convergence properties of functional estimates for discrete distributions


DOI: 10.1002/rsa.10019
zbMath: 0985.62006
OpenAlex: W2101985079
MaRDI QID: Q2772919

András Antos, Ioannis Kontoyiannis

Publication date: 23 May 2002

Published in: Random Structures and Algorithms

Full work available at URL: https://doi.org/10.1002/rsa.10019




Related Items (30)

Identifying anomalous signals in GPS data using HMMs: an increased likelihood of earthquakes?
Entropy, mutual information, and systematic measures of structured spiking neural networks
A quantum-mechanical derivation of the multivariate central limit theorem for Markov chains
Estimation of Entropy and Mutual Information
Entropy Estimation in Turing's Perspective
Mutual information in the frequency domain
BIAS REDUCTION OF THE NEAREST NEIGHBOR ENTROPY ESTIMATOR
Coincidences and estimation of entropies of random variables with large cardinalities
A nonparametric two‐sample test using a general φ‐divergence‐based mutual information
Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
Limit distributions and sensitivity analysis for empirical entropic optimal transport on countable spaces
Optimal non-asymptotic concentration of centered empirical relative entropy in the high-dimensional regime
Gaussian concentration bounds for stochastic chains of unbounded memory
Assessing the dependence structure of the components of hybrid time series processes using mutual information
Methods for diversity and overlap analysis in T-cell receptor populations
A Bernstein-von Mises theorem for discrete probability distributions
The resampling of entropies with the application of biodiversity
A Note on Entropy Estimation
Deviation inequalities for separately Lipschitz functionals of iterated random functions
Causality and Bayesian network PDEs for multiscale representations of porous media
Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions
Learning and generalization with the information bottleneck
Information in the Nonstationary Case
On convergence properties of Shannon entropy
Bias adjustment for a nonparametric entropy estimator
Tsallis conditional mutual information in investigating long range correlation in symbol sequences
Estimating entropy rate from censored symbolic time series: A test for time-irreversibility
On entropy estimation for distributions with countable support.
Unnamed Item
Unnamed Item


