On the Estimation of Functionals of the Probability Density and Its Derivatives
From MaRDI portal
Publication: Q4055007
Cited in (27):
- Entropy-based guidance of deep neural networks for accelerated convergence and improved performance
- Sequential convex programming for computing information-theoretic minimal partitions: nonconvex nonsmooth optimization
- Extropy estimators with applications in testing uniformity
- A new estimator of entropy
- Nonparametric entropy estimation of conditional distribution under length-biased right censored sample
- Limit theorems for nonparametric sample entropy estimators
- Local linear estimation of residual entropy function of conditional distributions
- Nonparametric confidence intervals for the integral of a function of an unknown density
- On integral functionals of a density
- Non-parametric estimation of the extropy and the entropy measures based on progressive type-II censored data with testing uniformity
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
- Exact and asymptotically optimal bandwidths for kernel estimation of density functionals
- Two measures of sample entropy
- Subset selection algorithm based on mutual information
- Smoothed kernel estimation of bivariate residual entropy function
- On estimating the residual Rényi entropy under progressive censoring
- Fourier series-based direct plug-in bandwidth selectors for kernel density estimation
- Estimation of entropy and other functionals of a multivariate density
- USING THE MUTUAL INFORMATION COEFFICIENT TO IDENTIFY LAGS IN NONLINEAR MODELS
- A note on the adaptive estimation of the differential entropy by wavelet methods
- A note on the strong consistency of nonparametric estimation of Shannon entropy in length-biased sampling
- Testing normality based on new entropy estimators
- Strongly consistent estimators of k-th order regression curves and rates of convergence
- Estimation of an entropy-based functional
- New kernel-type estimator of Shannon's entropy
- Big data and the central limit theorem: a statistical legend
- Density-free convergence properties of various estimators of entropy