Pages that link to "Item:Q809614"
From MaRDI portal
The following pages link to Learnability with respect to fixed distributions (Q809614):
Displaying 19 items.
- On universal learning algorithms (Q287154)
- Supervised learning and co-training (Q391741)
- Rigorous learning curve bounds from statistical mechanics (Q676243)
- Using the doubling dimension to analyze the generalization of learning algorithms (Q923877)
- An upper bound on the sample complexity of PAC-learning halfspaces with respect to the uniform distribution (Q1014429)
- Prediction, learning, uniform convergence, and scale-sensitive dimensions (Q1271550)
- Sample size lower bounds in PAC learning by Algorithmic Complexity Theory (Q1274920)
- Nonuniform learnability (Q1329161)
- A sufficient condition for polynomial distribution-dependent learnability (Q1364775)
- Learning distributions by their density levels: A paradigm for learning without a teacher (Q1370866)
- A general lower bound on the number of examples needed for learning (Q1823011)
- Improved lower bounds for learning from noisy examples: An information-theoretic approach (Q1854425)
- Inductive inference in the limit of empirically adequate theories (Q1902559)
- Learning under \(p\)-tampering poisoning attacks (Q2202514)
- When are epsilon-nets small? (Q2304628)
- Smart PAC-learners (Q2431423)
- A fixed-distribution PAC learning theory for neural FIR models (Q2490395)
- Sample Complexity Bounds on Differentially Private Learning via Communication Complexity (Q3454521)
- Smart PAC-Learners (Q3648763)