Pages that link to "Item:Q4816848"
The following pages link to Estimation of Entropy and Mutual Information (Q4816848):
Displaying 14 items.
- Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (Q742670)
- Encoding stimulus information by spike numbers and mean response time in primary auditory cortex (Q849495)
- A multivariate extension of mutual information for growing neural networks (Q2179069)
- Understanding autoencoders with information theoretic concepts (Q2185600)
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing (Q2373107)
- Applying the Multivariate Time-Rescaling Theorem to Neural Population Models (Q3016184)
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions (Q4819822)
- Information-Theoretic Bounds and Approximations in Neural Population Coding (Q5157152)
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information (Q5198608)
- Reliability of Information-Based Integration of EEG and fMRI Data: A Simulation Study (Q5380197)
- Analytical Calculation of Mutual Information between Weakly Coupled Poisson-Spiking Neurons in Models of Dynamically Gated Communication (Q5380637)
- Nonparametric Estimation of Kullback-Leibler Divergence (Q5383802)
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques (Q5441302)
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains (Q5457581)