Distribution estimation consistent in total variation and in two types of information divergence
Publication: 4014167
DOI: 10.1109/18.149496
zbMath: 0765.62007
OpenAlex: W2107676886
MaRDI QID: Q4014167
Andrew R. Barron, László Györfi, Edward C. van der Meulen
Publication date: 12 October 1992
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/ec64cd6ee30f41e5f507d8213ec2df9d0e607089
Keywords: convergence criteria; consistency in information divergence; histogram-based estimators; a priori information; consistency in reversed order information divergence; consistency in total variation
MSC classification: Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Statistical aspects of information-theoretic topics (62B10)
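The paper establishes that histogram-based density estimates can be consistent both in total variation and in information divergence. As a loose illustration only (not the authors' exact estimator — the bin count, sample size, and truncation below are arbitrary choices), the following sketch builds a histogram density estimate from samples of a known normal density and evaluates both error criteria on the cells of the partition:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Sample from a standard normal, truncated to [-4, 4] for a bounded partition.
n = 200_000
x = rng.standard_normal(n)
x = x[np.abs(x) < 4.0]

# Histogram estimate on an equal-width partition of [-4, 4] into 80 cells.
bins = np.linspace(-4.0, 4.0, 81)
counts, _ = np.histogram(x, bins=bins)
p = counts / counts.sum()                 # estimated cell probabilities

# True cell probabilities from the normal CDF, renormalised for the truncation.
cdf = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))
q = np.array([cdf(b) - cdf(a) for a, b in zip(bins[:-1], bins[1:])])
q /= q.sum()

# Total variation distance and information (Kullback-Leibler) divergence
# between the estimate and the truth, restricted to the partition.
tv = 0.5 * float(np.abs(p - q).sum())
mask = p > 0                              # 0 * log(0) contributes nothing
kl = float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

print(f"TV distance:   {tv:.4f}")
print(f"KL divergence: {kl:.4f}")
```

Both quantities shrink toward zero as the sample size grows and the partition is refined at a suitable rate, which is the kind of joint consistency the paper studies (along with the reversed-order divergence, with the roles of estimate and truth exchanged).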
Related Items
Divergence-type errors of smooth Barron-type density estimators
A new class of metric divergences on probability spaces and its applicability in statistics
Parameter selection in modified histogram estimates
Parameter identifiability with Kullback-Leibler information divergence criterion
Iterated modified histograms as dynamical systems
Cross-validated density estimates based on Kullback-Leibler information
Mutual information, metric entropy and cumulative relative entropy risk
Multivariate density estimation from privatised data: universal consistency and minimax rates
Predictive Inference Based on Markov Chain Monte Carlo Output
Optimal shrinkage estimation of predictive densities under α-divergences
Bayesian adaptation
On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
Strongly consistent nonparametric tests of conditional independence
Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
Divergence for s-concave and log concave functions
Selecting iterated modified histograms
Fast adaptive estimation of log-additive exponential models in Kullback-Leibler divergence
Multivariate Density Estimation by Bayesian Sequential Partitioning
How to get central limit theorems for global errors of estimates
A Steiner formula in the Lp Brunn-Minkowski theory
Information divergence estimation based on data-dependent partitions
A maximum smoothed likelihood estimator in the current status continuous mark model
Orlicz Addition for Measures and an Optimization Problem for the f-Divergence
Asymptotic unbiased density estimators
On the Lr-error in histogram density estimation: The multidimensional case
Mixing strategies for density estimation
Distribution Estimates Consistent in χ2-Divergence