Estimation of Entropy and Mutual Information
Publication: 4816848
DOI: 10.1162/089976603321780272
zbMath: 1052.62003
OpenAlex: W2114771311
MaRDI QID: Q4816848
Publication date: 14 September 2004
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976603321780272
Mathematics Subject Classification: Nonparametric estimation (62G05); Central limit and other weak theorems (60F05); Statistical aspects of information-theoretic topics (62B10)
Related Items (first 100 shown)
- Dependency Reduction with Divisive Normalization: Justification and Effectiveness
- Estimation bias in maximum entropy models
- A kernel-based calculation of information on a metric space
- Estimating functions of distributions defined over spaces of unknown size
- Entropy, mutual information, and systematic measures of structured spiking neural networks
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Encoding stimulus information by spike numbers and mean response time in primary auditory cortex
- Estimation of generalized entropies with sample spacing
- QUADRATIC TSALLIS ENTROPY BIAS AND GENERALIZED MAXIMUM ENTROPY MODELS
- EVALUATION OF MUTUAL INFORMATION ESTIMATORS FOR TIME SERIES
- An empirical study of the maximal and total information coefficients and leading measures of dependence
- Understanding Policy Diffusion in the U.S.: An Information-Theoretical Approach to Unveil Connectivity Structures in Slowly Evolving Complex Systems
- Entropy Estimation in Turing's Perspective
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
- Convergence of Monte Carlo distribution estimates from rival samplers
- Sublinear algorithms for approximating string compressibility
- Limit theorems for empirical Rényi entropy and divergence with applications to molecular diversity analysis
- Investigation on the high-order approximation of the entropy bias
- Non-parametric entropy estimators based on simple linear regression
- GENERALIZED CELLULAR NEURAL NETWORKS (GCNNs) CONSTRUCTED USING PARTICLE SWARM OPTIMIZATION FOR SPATIO-TEMPORAL EVOLUTIONARY PATTERN IDENTIFICATION
- A unified definition of mutual information with applications in machine learning
- Independent subspace analysis of the sea surface temperature variability: non-Gaussian sources and sensitivity to sampling and dimensionality
- A multivariate extension of mutual information for growing neural networks
- Understanding autoencoders with information theoretic concepts
- Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies
- Coincidences and estimation of entropies of random variables with large cardinalities
- Smoothed noise contrastive mutual information neural estimation
- A mutual information-based k-sample test for discrete distributions
- A nonparametric two-sample test using a general φ-divergence-based mutual information
- Cost-constrained group feature selection using information theory
- An improved estimator of Shannon entropy with applications to systems with memory
- Flow complexity in open systems: interlacing complexity index based on mutual information
- Ordinal symbolic analysis and its application to biomedical recordings
- Learn from your faults: leakage assessment in fault attacks using deep learning
- Equitability, mutual information, and the maximal information coefficient
- A mutual information estimator with exponentially decaying bias
- An Automatic Inequality Prover and Instance Optimal Identity Testing
- Identification of sparse neural functional connectivity using penalized likelihood estimation and basis functions
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
- Infragranular layers lead information flow during slow oscillations according to information directionality indicators
- Asymptotic normality for plug-in estimators of diversity indices on countable alphabets
- Variance estimators for the Lempel-Ziv entropy rate estimator
- Reliability of Information-Based Integration of EEG and fMRI Data: A Simulation Study
- Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication
- Information estimators for weighted observations
- Nonparametric Estimation of Kullback-Leibler Divergence
- Causality and Bayesian network PDEs for multiscale representations of porous media
- Warped phase coherence: An empirical synchronization measure combining phase and amplitude information
- Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions
- Sequential Fixed-Point ICA Based on Mutual Information Minimization
- Split-door criterion: identification of causal effects through auxiliary outcomes
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- A simple method for estimating the entropy of neural activity
- Sample complexity of the distinct elements problem
- Edgeworth Approximation of Multivariate Differential Entropy
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population
- Adaptive Design Optimization: A Mutual Information-Based Approach to Model Discrimination in Cognitive Science
- Quantifying Information Conveyed by Large Neuronal Populations
- Statistical estimation of mutual information for mixed model
- On the estimation of entropy for non-negative data
- Information-Theoretic Bounds and Approximations in Neural Population Coding
- Statistical mechanics of the US supreme court
- Large-scale multiple inference of collective dependence with applications to protein function
- Applying the Multivariate Time-Rescaling Theorem to Neural Population Models
- An information-theoretic approach to study spatial dependencies in small datasets
- Variational Representations and Neural Network Estimation of Rényi Divergences
- Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Detecting and testing altered brain connectivity networks with \(k\)-partite network topology
- How Synaptic Release Probability Shapes Neuronal Transmission: Information-Theoretic Analysis in a Cerebellar Granule Cell
- Learning and generalization with the information bottleneck
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Information divergence estimation based on data-dependent partitions
- Encoding uncertainty in the hippocampus
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains
- Efficient Markov Chain Monte Carlo Methods for Decoding Neural Spike Trains
- Least-Squares Independent Component Analysis
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- The relation between Granger causality and directed information theory: a review
- Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- Bias adjustment for a nonparametric entropy estimator
- Bootstrap methods for the empirical study of decision-making and information flows in social systems
- Statistical estimation of conditional Shannon entropy
- The impact of clean spark spread expectations on storage hydropower generation
- Entropic representation and estimation of diversity indices
- Robust Sensitivity Analysis for Stochastic Systems
- Information processing in the LGN: a comparison of neural codes and cell types
- Efficient feature selection using shrinkage estimators
- The Spike-Triggered Average of the Integrate-and-Fire Cell Driven by Gaussian White Noise
- Topological features determining the error in the inference of networks using transfer entropy
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
Cites Work
- Achieving information bounds in non and semiparametric models
- An Efron-Stein inequality for nonsymmetric statistics
- The jackknife estimate of variance
- Geometrizing rates of convergence. II
- Minimax lower bounds and moduli of continuity
- Consistency of data-driven histogram methods for density estimation and classification
- Limit theorems for the logarithm of sample spacings
- Weighted sums of certain dependent random variables
- On the mathematical foundations of learning
- Convergence properties of functional estimates for discrete distributions
- Second-order noiseless source coding theorems
- Estimation of the information by an adaptive partitioning of the observation space
- On Choosing and Bounding Probability Metrics
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
- On a Class of Problems Related to the Random Division of an Interval
- Cramér-Rao type integral inequalities for general loss functions