Publication: 4558567
From MaRDI portal
zbMath: 1473.68174 · MaRDI QID: Q4558567
Robert C. Williamson, Brendan van Rooyen
Publication date: 22 November 2018
Full work available at URL: http://jmlr.csail.mit.edu/papers/v18/16-315.html
Keywords: noise; decision theory; supervised learning; data processing; minimax bounds; generalized supervision
62H30: Classification and discrimination; cluster analysis (statistical aspects)
62G05: Nonparametric estimation
68T05: Learning and adaptive systems in artificial intelligence
Cites Work
- Classification with asymmetric label noise: consistency and maximal denoising
- Convexity and well-posed problems
- Risk bounds for statistical learning
- The geometry of proper scoring rules
- Fast learning rates for plug-in classifiers
- Relative entropy under mappings by stochastic matrices
- Information-theoretic determination of minimax rates of convergence
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Support-vector networks
- Empirical minimization
- Efficient noise-tolerant learning from statistical queries
- Support Vector Machines
- Information-theoretic upper and lower bounds for statistical estimation
- A discriminative model for semi-supervised learning
- Covering numbers for real-valued function classes
- Conditioning as disintegration
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Lower Bounds for the Minimax Risk Using $f$-Divergences, and Applications
- Strictly Proper Scoring Rules, Prediction, and Estimation
- Sufficiency and Approximate Sufficiency
- Discrete-Variable Extremum Problems
- Information, Divergence and Risk for Binary Experiments
- Prediction, Learning, and Games
- Uncertainty, Information, and Sequential Experiments
- Convexity, Classification, and Risk Bounds