The following pages link to (Q3289420):
Displaying 15 items.
- Convergence of Markov chains in information divergence (Q1014048)
- Entropy meaning of summability of the logarithm (Q1086910)
- The positive-divergence and blowing-up properties (Q1327514)
- A unified definition of mutual information with applications in machine learning (Q1664976)
- How many queries are needed to distinguish a truncated random permutation from a random function? (Q1747661)
- Deformed statistics Kullback-Leibler divergence minimization within a scaled Bregman framework (Q1928038)
- Necessary criterion for approximate recoverability (Q1991514)
- A categorical characterization of relative entropy on standard Borel spaces (Q2130589)
- Santha-Vazirani sources, deterministic condensers and very strong extractors (Q2195575)
- Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals (Q2465342)
- On the f-divergence and singularity of probability measures (Q2556186)
- A variational characterization of one-dimensional countable state Gibbs random fields (Q3219549)
- (Q4605193)
- (Q4909633)
- (Q5538634)