Sample estimate of the entropy of a random vector

From MaRDI portal

zbMath: 0633.62005
MaRDI QID: Q1096264

L. F. Kozachenko, Nikolai N. Leonenko

Publication date: 1987

Published in: Problems of Information Transmission


62B10: Statistical aspects of information-theoretic topics
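The entry itself carries only metadata, but the paper it records introduced what is now called the Kozachenko-Leonenko nearest-neighbour estimator of differential entropy. A minimal sketch, assuming the now-standard k = 1 form of the estimator (brute-force nearest neighbours; function and variable names are mine, not from the paper):

```python
import math
import random

def kl_entropy(points):
    """Kozachenko-Leonenko nearest-neighbour estimate of differential
    entropy (in nats) from a sample of d-dimensional points.

    H_hat = (d/n) * sum_i ln rho_i + ln V_d + ln(n - 1) + gamma,
    where rho_i is the distance from point i to its nearest neighbour,
    V_d is the volume of the unit ball in R^d, and gamma is the
    Euler-Mascheroni constant.
    """
    n = len(points)
    d = len(points[0])
    # log-volume of the unit ball in R^d: V_d = pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    euler_gamma = 0.5772156649015329
    log_rho_sum = 0.0
    for i, p in enumerate(points):
        # squared distance to the nearest *other* sample (O(n^2) brute force)
        nn_sq = min(
            sum((a - b) ** 2 for a, b in zip(p, q))
            for j, q in enumerate(points)
            if j != i
        )
        log_rho_sum += 0.5 * math.log(nn_sq)
    return d * log_rho_sum / n + log_vd + math.log(n - 1) + euler_gamma
```

For a uniform sample on [0, 1] the true differential entropy is 0, and the estimate should be close to it for moderate n; the k-nearest-neighbour generalisation (replacing the gamma term with a digamma correction) reduces variance at the cost of some bias.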


Related Items

Asymptotics for Euclidean functionals of mixing processes
Entropy production and Vlasov equation for self-gravitating systems
A density based empirical likelihood approach for testing bivariate normality
Statistical estimation of conditional Shannon entropy
Causality of energy-containing eddies in wall turbulence
A Note on Bayesian Inference for Long-Range Dependence of a Stationary Two-State Process
Information-Maximization Clustering Based on Squared-Loss Mutual Information
A Nonparametric Clustering Algorithm with a Quantile-Based Likelihood Estimator
A nearest-neighbor based nonparametric test for viral remodeling in heterogeneous single-cell proteomic data
Non-parametric estimation of mutual information through the entropy of the linkage
Limit theory for point processes in manifolds
Nonparametric estimation of information-based measures of statistical dispersion
Is mutual information adequate for feature selection in regression?
Information estimators for weighted observations
Entropy propagation analysis in stochastic structural dynamics: application to a beam with uncertain cross sectional area
On the Kozachenko-Leonenko entropy estimator
Parametric Bayesian estimation of differential entropy and relative entropy
Nearest neighbor estimates of entropy for multivariate circular distributions
Design of computer experiments: space filling and beyond
Effect of neuromodulation of short-term plasticity on information processing in hippocampal interneuron synapses
\(k_n\)-nearest neighbor estimators of entropy
The relation between Granger causality and directed information theory: a review
An information-theoretic approach to assess practical identifiability of parametric dynamical systems
A class of Rényi information estimators for multidimensional densities
Statistical inference for the \(\epsilon\)-entropy and the quadratic Rényi entropy
Reliability of coupled oscillators
Estimating mutual information for feature selection in the presence of label noise
Non-parametric entropy estimators based on simple linear regression
On mutual information estimation for mixed-pair random variables
Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
Statistical estimation of the Shannon entropy
Entropy-based inhomogeneity detection in fiber materials
Optimal Latin hypercube designs for the Kullback-Leibler criterion
Parametric generation of conditional geological realizations using generative neural networks
Spatio-chromatic information available from different neural layers via gaussianization
The entropy based goodness of fit tests for generalized von Mises-Fisher distributions and beyond
Hysteresis and disorder-induced order in continuous kinetic-like opinion dynamics in complex networks
Minimax estimation of norms of a probability density. I: Lower bounds
Entropy-based test for generalised Gaussian distributions
Detecting anomalies in fibre systems using 3-dimensional image data
Local nearest neighbour classification with applications to semi-supervised learning
A model-free Bayesian classifier
Decomposition in derivative-free optimization
A Bayesian nonparametric estimation to entropy
Statistical estimation of mutual information for mixed model
Large-scale multiple inference of collective dependence with applications to protein function
Similarity of interspike interval distributions and information gain in a stationary neuronal firing
On entropy estimation by \(m\)-spacing method
Statistical Inference for Rényi Entropy Functionals
Entropy production in systems with long range interactions
Calculating the Mutual Information between Two Spike Trains
Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles