Pages that link to "Item:Q2772919"
From MaRDI portal
The following pages link to Convergence properties of functional estimates for discrete distributions (Q2772919):
Displaying 33 items.
- Coincidences and estimation of entropies of random variables with large cardinalities (Q400965)
- Deviation inequalities for separately Lipschitz functionals of iterated random functions (Q468728)
- Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions (Q657559)
- On convergence properties of Shannon entropy (Q734295)
- Bias adjustment for a nonparametric entropy estimator (Q742735)
- Entropy, mutual information, and systematic measures of structured spiking neural networks (Q827853)
- Mutual information in the frequency domain (Q866650)
- Assessing the dependence structure of the components of hybrid time series processes using mutual information (Q904299)
- Learning and generalization with the information bottleneck (Q982641)
- On entropy estimation for distributions with countable support. (Q1565879)
- A Bernstein-von Mises theorem for discrete probability distributions (Q1951969)
- Tsallis conditional mutual information in investigating long range correlation in symbol sequences (Q2067092)
- A quantum-mechanical derivation of the multivariate central limit theorem for Markov chains (Q2128258)
- Causality and Bayesian network PDEs for multiscale representations of porous media (Q2222317)
- Identifying anomalous signals in GPS data using HMMs: an increased likelihood of earthquakes? (Q2361174)
- Methods for diversity and overlap analysis in T-cell receptor populations (Q2435078)
- Entropy Estimation in Turing's Perspective (Q2919410)
- Information in the Nonstationary Case (Q3613610)
- (Q4636979)
- Estimation of Entropy and Mutual Information (Q4816848)
- Estimating entropy rate from censored symbolic time series: A test for time-irreversibility (Q4983669)
- The resampling of entropies with the application of biodiversity (Q5128588)
- (Q5134544)
- BIAS REDUCTION OF THE NEAREST NEIGHBOR ENTROPY ESTIMATOR (Q5322608)
- A Note on Entropy Estimation (Q5380327)
- A nonparametric two‐sample test using a general <i>φ</i>‐divergence‐based mutual information (Q6067721)
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu (Q6110527)
- Limit distributions and sensitivity analysis for empirical entropic optimal transport on countable spaces (Q6126806)
- Optimal non-asymptotic concentration of centered empirical relative entropy in the high-dimensional regime (Q6165358)
- Gaussian concentration bounds for stochastic chains of unbounded memory (Q6187463)
- An entropy-based measure of correlation for time series (Q6490368)
- A note on a Bernstein-type inequality for the log-likelihood function of categorical variables with infinitely many levels (Q6619729)
- Leakage certification made simple (Q6652997)