Estimation of entropy-type integral functionals
Publication:2807736
Abstract: Entropy-type integral functionals of densities are widely used in mathematical statistics, information theory, and computer science. Examples include measures of closeness between distributions (e.g., the density power divergence) and uncertainty characteristics of a random variable (e.g., the Rényi entropy). In this paper, we study U-statistic estimators for a class of such functionals. The estimators are based on \(\epsilon\)-close vector observations in the corresponding independent and identically distributed samples. We prove asymptotic properties of the estimators (consistency and asymptotic normality) under mild integrability and smoothness conditions on the densities. The results apply to diverse problems in mathematical statistics and computer science, e.g., distribution identification, approximate matching for random databases, and two-sample problems.
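To illustrate the \(\epsilon\)-close-pairs idea for one functional in this class, the quadratic functional \(\int f^2\), and hence the quadratic Rényi entropy \(H_2 = -\log \int f^2\), can be estimated from the fraction of sample pairs falling within distance \(\epsilon\) of each other, normalized by the volume of the \(\epsilon\)-ball. A minimal NumPy sketch of this construction (the function name and parameters are illustrative, not taken from the paper):

```python
import numpy as np
from math import gamma, pi

def quadratic_renyi_entropy(x, eps):
    """Sketch of an epsilon-close-pairs U-statistic estimator of the
    quadratic Renyi entropy H_2 = -log(integral of f^2).
    Hypothetical helper, not the paper's exact formulation."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape if x.ndim == 2 else (x.size, 1)
    x = x.reshape(n, d)
    # volume of the d-dimensional ball of radius eps
    v_eps = pi ** (d / 2) / gamma(d / 2 + 1) * eps ** d
    # U-statistic kernel: indicator that a pair is eps-close
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    close = np.sum(dists[np.triu_indices(n, k=1)] <= eps)
    # normalized pair count estimates the quadratic functional int f^2
    q_hat = 2.0 * close / (n * (n - 1) * v_eps)
    return -np.log(q_hat)
```

For a Uniform(0, 1) sample, \(\int f^2 = 1\), so the estimate should be close to \(H_2 = 0\) (up to a small boundary bias of order \(\epsilon\)), consistent with the consistency result stated in the abstract.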
Cites work
- scientific article; zbMATH DE number 47948
- scientific article; zbMATH DE number 2221907
- A class of Rényi information estimators for multidimensional densities
- A simple adaptive estimator of the integrated square of a density
- Average case analysis in database problems
- Central limit theorems for a class of symmetric statistics
- Decomposable pseudodistances and applications in statistical estimation
- Efficient estimation of integral functionals of a density
- Entropy, divergence and distance measures with econometric applications
- Estimation of Nonlinear Functionals of Densities With Confidence
- Estimation of integral functionals of a density
- Estimation of integral functionals of a density and its derivatives
- Foundations of Modern Probability
- Gaussian limits for generalized spacings
- Generalised two-sample \(U\)-statistics and a two-species reaction-diffusion model
- Goodness of fit tests based on the \(L_2\)-norm of multivariate probability density functions
- Image matching using \(\alpha\)-entropy measures and entropic graphs
- Information Theory in Computer Vision and Pattern Recognition
- Information theoretic learning. Renyi's entropy and kernel perspectives
- Limit theorems for a triangular scheme of U-statistics with applications to inter-point distances
- Nonparametric confidence intervals for the integral of a function of an unknown density
- Nonparametric testing of closeness between two unknown distribution functions
- Random databases with approximate record matching
- Robust and efficient estimation by minimising a density power divergence
- Statistical inference for Rényi entropy functionals
- Statistical inference for the \(\epsilon \)-entropy and the quadratic Rényi entropy
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
Cited in (9)
- Estimation of entropy and extropy based on right censored data: a Bayesian non-parametric approach
- Functional calibration estimation by the maximum entropy on the mean principle
- Divergence measures estimation and its asymptotic normality theory in the discrete case
- Bayesian estimation of extropy and goodness of fit tests
- Statistical inference for the \(\epsilon \)-entropy and the quadratic Rényi entropy
- Statistical estimation of quadratic Rényi entropy for a stationary \(m\)-dependent sequence
- AN ENTROPY BASED GLIMM-TYPE FUNCTIONAL
- Estimation of an entropy-based functional
- Statistical inference for Rényi entropy functionals