Pages that link to "Item:Q4816848"
From MaRDI portal
The following pages link to Estimation of Entropy and Mutual Information (Q4816848):
Displayed 50 items.
- A mutual information-based <i>k</i>-sample test for discrete distributions (Q2953264) (← links)
- Ordinal symbolic analysis and its application to biomedical recordings (Q2955727) (← links)
- Equitability, mutual information, and the maximal information coefficient (Q2962211) (← links)
- An Automatic Inequality Prover and Instance Optimal Identity Testing (Q2968159) (← links)
- Applying the Multivariate Time-Rescaling Theorem to Neural Population Models (Q3016184) (← links)
- Estimating Entropy Rates with Bayesian Confidence Intervals (Q3025070) (← links)
- Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains (Q3070780) (← links)
- Efficient Markov Chain Monte Carlo Methods for Decoding Neural Spike Trains (Q3070781) (← links)
- Least-Squares Independent Component Analysis (Q3070790) (← links)
- Understanding Policy Diffusion in the U.S.: An Information-Theoretical Approach to Unveil Connectivity Structures in Slowly Evolving Complex Systems (Q3188149) (← links)
- A simple method for estimating the entropy of neural activity (Q3301563) (← links)
- The Spike-Triggered Average of the Integrate-and-Fire Cell Driven by Gaussian White Noise (Q3413080) (← links)
- QUADRATIC TSALLIS ENTROPY BIAS AND GENERALIZED MAXIMUM ENTROPY MODELS (Q3462271) (← links)
- Variance estimators for the Lempel-Ziv entropy rate estimator (Q3531679) (← links)
- Adaptive Design Optimization: A Mutual Information-Based Approach to Model Discrimination in Cognitive Science (Q3556782) (← links)
- Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory (Q3573102) (← links)
- How Synaptic Release Probability Shapes Neuronal Transmission: Information-Theoretic Analysis in a Cerebellar Granule Cell (Q3583492) (← links)
- Flow complexity in open systems: interlacing complexity index based on mutual information (Q4594151) (← links)
- Warped phase coherence: An empirical synchronization measure combining phase and amplitude information (Q4627625) (← links)
- (Q4636979) (← links)
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity (Q4819813) (← links)
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions (Q4819822) (← links)
- Statistical estimation of conditional Shannon entropy (Q4967803) (← links)
- (Q5002623) (← links)
- (Q5134626) (← links)
- (Q5134627) (← links)
- (Q5134628) (← links)
- Quantifying Information Conveyed by Large Neuronal Populations (Q5154160) (← links)
- Information-Theoretic Bounds and Approximations in Neural Population Coding (Q5157152) (← links)
- An information-theoretic approach to study spatial dependencies in small datasets (Q5161106) (← links)
- Variational Representations and Neural Network Estimation of Rényi Divergences (Q5162628) (← links)
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information (Q5198608) (← links)
- (Q5214211) (← links)
- EVALUATION OF MUTUAL INFORMATION ESTIMATORS FOR TIME SERIES (Q5306410) (← links)
- GENERALIZED CELLULAR NEURAL NETWORKS (GCNNs) CONSTRUCTED USING PARTICLE SWARM OPTIMIZATION FOR SPATIO-TEMPORAL EVOLUTIONARY PATTERN IDENTIFICATION (Q5322594) (← links)
- Asymptotic normality for plug-in estimators of diversity indices on countable alphabets (Q5375957) (← links)
- Reliability of Information-Based Integration of EEG and fMRI Data: A Simulation Study (Q5380197) (← links)
- Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication (Q5380637) (← links)
- Nonparametric Estimation of Küllback-Leibler Divergence (Q5383802) (← links)
- Sequential Fixed-Point ICA Based on Mutual Information Minimization (Q5387455) (← links)
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques (Q5441302) (← links)
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains (Q5457581) (← links)
- Edgeworth Approximation of Multivariate Differential Entropy (Q5706652) (← links)
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population (Q5706656) (← links)
- Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies (Q6040684) (← links)
- Smoothed noise contrastive mutual information neural estimation (Q6061149) (← links)
- A nonparametric two‐sample test using a general <i>φ</i>‐divergence‐based mutual information (Q6067721) (← links)
- Cost-constrained group feature selection using information theory (Q6072457) (← links)
- Learn from your faults: leakage assessment in fault attacks using deep learning (Q6110383) (← links)
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu (Q6110527) (← links)