Estimation of Entropy and Mutual Information
Cites work
- Scientific article; zbMATH DE number 1164155 (title unavailable)
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
- Achieving information bounds in non and semiparametric models
- An Efron-Stein inequality for nonsymmetric statistics
- Consistency of data-driven histogram methods for density estimation and classification
- Convergence properties of functional estimates for discrete distributions
- Cramér-Rao type integral inequalities for general loss functions
- Estimation of the information by an adaptive partitioning of the observation space
- Geometrizing rates of convergence. II
- Limit theorems for the logarithm of sample spacings
- Minimax lower bounds and moduli of continuity
- On Choosing and Bounding Probability Metrics
- On a Class of Problems Related to the Random Division of an Interval
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the mathematical foundations of learning
- Second-order noiseless source coding theorems
- The jackknife estimate of variance
- Weighted sums of certain dependent random variables
Cited in
- Quadratic Tsallis entropy bias and generalized maximum entropy models
- Statistical estimation of conditional Shannon entropy
- Limit theorems for empirical Rényi entropy and divergence with applications to molecular diversity analysis
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- Learn from your faults: leakage assessment in fault attacks using deep learning
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
- Statistical mechanics of the US Supreme Court
- Infragranular layers lead information flow during slow oscillations according to information directionality indicators
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
- Asymptotic normality for plug-in estimators of diversity indices on countable alphabets
- A computationally efficient estimator for mutual information
- On the Kozachenko-Leonenko entropy estimator
- Estimating the errors on measured entropy and mutual information
- On directed information theory and Granger causality graphs
- A robust-equitable measure for feature ranking and selection
- Variance estimators for the Lempel-Ziv entropy rate estimator
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- Sample complexity of the distinct elements problem
- Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication
- Geometric k-nearest neighbor estimation of entropy and mutual information
- On spatially correlated observations in importance sampling methods for subsidence estimation
- Information processing in the LGN: a comparison of neural codes and cell types
- A review of symbolic dynamics and symbolic reconstruction of dynamical systems
- Efficient Markov chain Monte Carlo methods for decoding neural spike trains
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population
- Flow complexity in open systems: interlacing complexity index based on mutual information
- Reliability of information-based integration of EEG and fMRI data: a simulation study
- Approximate profile maximum likelihood
- Entropy estimation in Turing's perspective
- Entropic representation and estimation of diversity indices
- An empirical study of the maximal and total information coefficients and leading measures of dependence
- Dependency reduction with divisive normalization: justification and effectiveness
- Topological features determining the error in the inference of networks using transfer entropy
- Split-door criterion: identification of causal effects through auxiliary outcomes
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Efficient feature selection using shrinkage estimators
- Entropy, mutual information, and systematic measures of structured spiking neural networks
- Entropy estimation via uniformization
- Optimal bandwidth selection for re-substitution entropy estimation
- Representation of Mutual Information Via Input Estimates
- Calculation of information flow rate from mutual information
- On an entropy conservation principle
- An information-theoretic approach to study spatial dependencies in small datasets
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- Information divergence estimation based on data-dependent partitions
- Nonparametric estimation of Kullback-Leibler divergence
- Variational representations and neural network estimation of Rényi divergences
- Statistical estimation of the Shannon entropy
- Warped phase coherence: An empirical synchronization measure combining phase and amplitude information
- Statistical estimation of mutual information for mixed model
- Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains
- Estimation bias in maximum entropy models
- A kernel-based calculation of information on a metric space
- Estimating functions of distributions defined over spaces of unknown size
- Non-parametric estimation of mutual information through the entropy of the linkage
- Adaptive Design Optimization: A Mutual Information-Based Approach to Model Discrimination in Cognitive Science
- Identification Entropy
- Coincidences and estimation of entropies of random variables with large cardinalities
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
- Detecting and testing altered brain connectivity networks with \(k\)-partite network topology
- Nonparametric estimation of information-based measures of statistical dispersion
- Learning and generalization with the information bottleneck
- Limits to extreme event forecasting in chaotic systems
- A mutual information-based \(k\)-sample test for discrete distributions
- On the estimation of entropy
- Estimation of generalized entropies with sample spacing
- Information theory in neuroscience
- Causality and Bayesian network PDEs for multiscale representations of porous media
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Understanding Policy Diffusion in the U.S.: An Information-Theoretical Approach to Unveil Connectivity Structures in Slowly Evolving Complex Systems
- Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions
- Identification of sparse neural functional connectivity using penalized likelihood estimation and basis functions
- On the estimation of entropy for non-negative data
- Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies
- Quantifying information conveyed by large neuronal populations
- Equitability, mutual information, and the maximal information coefficient
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- A simple method for estimating the entropy of neural activity
- Applying the multivariate time-rescaling theorem to neural population models
- A multivariate extension of mutual information for growing neural networks
- Large-scale multiple inference of collective dependence with applications to protein function
- Information-theoretic bounds and approximations in neural population coding
- The relation between Granger causality and directed information theory: a review
- Properties of Shannon and Rényi entropies of the Poisson distribution as the functions of intensity parameter
- Smoothed noise contrastive mutual information neural estimation
- Sublinear algorithms for approximating string compressibility
- Sequential Fixed-Point ICA Based on Mutual Information Minimization
- Cost-constrained group feature selection using information theory
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- Jensen-Mercer inequality for uniformly convex functions with some applications
- Information estimators for weighted observations
- Robust sensitivity analysis for stochastic systems
- A nonparametric two-sample test using a general φ-divergence-based mutual information
- Generalized cellular neural networks (GCNNs) constructed using particle swarm optimization for spatio-temporal evolutionary pattern identification
- Bootstrap methods for the empirical study of decision-making and information flows in social systems
- Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples
- Hybrid statistical estimation of mutual information and its application to information flow
- Non-parametric entropy estimators based on simple linear regression
- Evaluation of mutual information estimators for time series
- Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks