Pages that link to "Item:Q1383090"
The following pages link to Mutual information, metric entropy and cumulative relative entropy risk (Q1383090):
Displaying 17 items.
- Optimal quantization of the support of a continuous multivariate distribution based on mutual information (Q269123)
- Information-theoretic determination of minimax rates of convergence (Q1578277)
- Hölder's identity (Q1726938)
- Mixing strategies for density estimation (Q1848770)
- Improved lower bounds for learning from noisy examples: an information-theoretic approach (Q1854425)
- Simultaneous prediction of independent Poisson observables (Q1879973)
- Loss of information of a statistic for a family of non-regular distributions. II: More general case (Q1925993)
- Convergence rates of deep ReLU networks for multiclass classification (Q2137813)
- Bayesian parametric inference in a nonparametric framework (Q2384667)
- Predictability, Complexity, and Learning (Q2784814)
- Statistical Decision Problems and Bayesian Nonparametric Methods (Q3421334)
- Catching up Faster by Switching Sooner: A Predictive Approach to Adaptive Estimation with an Application to the AIC–BIC Dilemma (Q4632670)
- Conjugate Priors Represent Strong Pre-Experimental Assumptions (Q4677092)
- Prequential analysis of complex data with adaptive model reselection (Q4969698)
- Quantifying Information Conveyed by Large Neuronal Populations (Q5154160)
- Entropy-SGD: biasing gradient descent into wide valleys (Q5854121)
- Information aware max-norm Dirichlet networks for predictive uncertainty estimation (Q6078671)