Estimation of Entropy and Mutual Information

Publication:4816848

DOI: 10.1162/089976603321780272
zbMath: 1052.62003
OpenAlex: W2114771311
MaRDI QID: Q4816848

Liam Paninski

Publication date: 14 September 2004

Published in: Neural Computation

Full work available at URL: https://doi.org/10.1162/089976603321780272




Related Items (only showing first 100 items)

Dependency Reduction with Divisive Normalization: Justification and Effectiveness
Estimation bias in maximum entropy models
A kernel-based calculation of information on a metric space
Estimating functions of distributions defined over spaces of unknown size
Entropy, mutual information, and systematic measures of structured spiking neural networks
Similarity of interspike interval distributions and information gain in a stationary neuronal firing
Encoding stimulus information by spike numbers and mean response time in primary auditory cortex
Estimation of generalized entropies with sample spacing
QUADRATIC TSALLIS ENTROPY BIAS AND GENERALIZED MAXIMUM ENTROPY MODELS
EVALUATION OF MUTUAL INFORMATION ESTIMATORS FOR TIME SERIES
An empirical study of the maximal and total information coefficients and leading measures of dependence
Unnamed Item
Understanding Policy Diffusion in the U.S.: An Information-Theoretical Approach to Unveil Connectivity Structures in Slowly Evolving Complex Systems
Entropy Estimation in Turing's Perspective
Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
Convergence of Monte Carlo distribution estimates from rival samplers
Sublinear algorithms for approximating string compressibility
Limit theorems for empirical Rényi entropy and divergence with applications to molecular diversity analysis
Investigation on the high-order approximation of the entropy bias
Non-parametric entropy estimators based on simple linear regression
GENERALIZED CELLULAR NEURAL NETWORKS (GCNNs) CONSTRUCTED USING PARTICLE SWARM OPTIMIZATION FOR SPATIO-TEMPORAL EVOLUTIONARY PATTERN IDENTIFICATION
A unified definition of mutual information with applications in machine learning
Independent subspace analysis of the sea surface temperature variability: non-Gaussian sources and sensitivity to sampling and dimensionality
A multivariate extension of mutual information for growing neural networks
Understanding autoencoders with information theoretic concepts
Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies
Coincidences and estimation of entropies of random variables with large cardinalities
Smoothed noise contrastive mutual information neural estimation
A mutual information-based k-sample test for discrete distributions
A nonparametric two-sample test using a general φ-divergence-based mutual information
Cost-constrained group feature selection using information theory
An improved estimator of Shannon entropy with applications to systems with memory
Flow complexity in open systems: interlacing complexity index based on mutual information
Ordinal symbolic analysis and its application to biomedical recordings
Learn from your faults: leakage assessment in fault attacks using deep learning
Unnamed Item
Equitability, mutual information, and the maximal information coefficient
A mutual information estimator with exponentially decaying bias
An Automatic Inequality Prover and Instance Optimal Identity Testing
Identification of sparse neural functional connectivity using penalized likelihood estimation and basis functions
Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
Infragranular layers lead information flow during slow oscillations according to information directionality indicators
Asymptotic normality for plug-in estimators of diversity indices on countable alphabets
Variance estimators for the Lempel-Ziv entropy rate estimator
Reliability of Information-Based Integration of EEG and fMRI Data: A Simulation Study
Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication
Information estimators for weighted observations
Nonparametric Estimation of Küllback-Leibler Divergence
Causality and Bayesian network PDEs for multiscale representations of porous media
Warped phase coherence: An empirical synchronization measure combining phase and amplitude information
Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions
Sequential Fixed-Point ICA Based on Mutual Information Minimization
Split-door criterion: identification of causal effects through auxiliary outcomes
Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
A simple method for estimating the entropy of neural activity
Sample complexity of the distinct elements problem
Edgeworth Approximation of Multivariate Differential Entropy
Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population
Adaptive Design Optimization: A Mutual Information-Based Approach to Model Discrimination in Cognitive Science
Quantifying Information Conveyed by Large Neuronal Populations
Statistical estimation of mutual information for mixed model
On the estimation of entropy for non-negative data
Information-Theoretic Bounds and Approximations in Neural Population Coding
Statistical mechanics of the US supreme court
Large-scale multiple inference of collective dependence with applications to protein function
Applying the Multivariate Time-Rescaling Theorem to Neural Population Models
An information-theoretic approach to study spatial dependencies in small datasets
Variational Representations and Neural Network Estimation of Rényi Divergences
Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory
Estimating Entropy Rates with Bayesian Confidence Intervals
Detecting and testing altered brain connectivity networks with \(k\)-partite network topology
How Synaptic Release Probability Shapes Neuronal Transmission: Information-Theoretic Analysis in a Cerebellar Granule Cell
Learning and generalization with the information bottleneck
Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
Information divergence estimation based on data-dependent partitions
Encoding uncertainty in the hippocampus
A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains
Efficient Markov Chain Monte Carlo Methods for Decoding Neural Spike Trains
Least-Squares Independent Component Analysis
Estimating Information Rates with Confidence Intervals in Neural Spike Trains
The relation between Granger causality and directed information theory: a review
Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples
Bayesian and quasi-Bayesian estimators for mutual information from discrete data
Bias adjustment for a nonparametric entropy estimator
Bootstrap methods for the empirical study of decision-making and information flows in social systems
Statistical estimation of conditional Shannon entropy
The impact of clean spark spread expectations on storage hydropower generation
Entropic representation and estimation of diversity indices
Unnamed Item
Unnamed Item
Unnamed Item
Robust Sensitivity Analysis for Stochastic Systems
Information processing in the LGN: a comparison of neural codes and cell types
Efficient feature selection using shrinkage estimators
The Spike-Triggered Average of the Integrate-and-Fire Cell Driven by Gaussian White Noise
Topological features determining the error in the inference of networks using transfer entropy
Unnamed Item
Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu




Cites Work




This page was built for publication: Estimation of Entropy and Mutual Information