Pages that link to "Item:Q3025070"
From MaRDI portal
The following pages link to Estimating Entropy Rates with Bayesian Confidence Intervals (Q3025070):
Displaying 18 items.
- Coincidences and estimation of entropies of random variables with large cardinalities (Q400965)
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data (Q742724)
- Predicting the synaptic information efficacy in cortical layer 5 pyramidal neurons using a minimal integrate-and-fire model (Q999412)
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective (Q1704756)
- On the permutation entropy Bayesian estimation (Q2025513)
- Information processing in the LGN: a comparison of neural codes and cell types (Q2317476)
- The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems (Q2509367)
- Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains (Q3070780)
- Variance estimators for the Lempel-Ziv entropy rate estimator (Q3531679)
- Indices for Testing Neural Codes (Q3544320)
- Information in the Nonstationary Case (Q3613610)
- Optimal instruments and models for noisy chaos (Q3636695)
- (Q5134626)
- (Q5134641)
- A Locally Optimal Algorithm for Estimating a Generating Partition from an Observed Time Series and Its Application to Anomaly Detection (Q5157238)
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information (Q5198608)
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques (Q5441302)
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains (Q5457581)