Classification with asymmetric label noise: consistency and maximal denoising
DOI: 10.1214/16-EJS1193; zbMATH Open: 1347.62106; arXiv: 1303.1208; OpenAlex: W2963314381; MaRDI QID: Q315419; FDO: Q315419
Authors: Gilles Blanchard, Marek Flaska, Clayton Scott, Gregory Handy, Sara Pozzi
Publication date: 21 September 2016
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1303.1208
Recommendations
- Corrigendum to: "Classification with asymmetric label noise: consistency and maximal denoising"
- scientific article; zbMATH DE number 7306869
- How to handle noisy labels for robust learning from uncertainty
- Classification with label noise: a Markov chain sampling framework
- On classifier behavior in the presence of mislabeling noise
- Asymmetric Error Control Under Imperfect Supervision: A Label-Noise-Adjusted Neyman–Pearson Umbrella Algorithm
- Learning from binary labels with instance-dependent noise
- scientific article; zbMATH DE number 6982911
- Margin-based generalization for classifications with input noise
Mathematics Subject Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Pattern recognition, speech recognition (68T10)
Cites Work
- Support Vector Machines
- Title not available
- Title not available
- Convexity, Classification, and Risk Bounds
- Strong uniform times and finite random walks
- Efficient noise-tolerant learning from statistical queries
- Foundations of machine learning
- Title not available
- Calibrated asymmetric surrogate losses
- Random classification noise defeats all convex potential boosters
- Robust supervised classification with mixture models: learning from data with uncertain labels
- Composite binary losses
- Semi-supervised novelty detection
- Tutorial on practical prediction theory for classification
- Classification with asymmetric label noise: consistency and maximal denoising
- Noise-tolerant distribution-free learning of general geometric concepts
- Title not available
- Multi-instance learning with any hypothesis class
- Boosting in the presence of noise
Cited In (20)
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- Adaptive transfer learning
- Classification trees with mismeasured responses
- A theory of learning with corrupted labels
- Harmless label noise and informative soft-labels in supervised classification
- Binary classification with corrupted labels
- Classification with imperfect training labels
- Title not available
- Classification with asymmetric label noise: consistency and maximal denoising
- Principled analytic classifier for positive-unlabeled learning via weighted integral probability metric
- Asymmetric Error Control Under Imperfect Supervision: A Label-Noise-Adjusted Neyman–Pearson Umbrella Algorithm
- Estimating the class prior for positive and unlabelled data via logistic regression
- Decontamination of mutual contamination models
- Learning from binary labels with instance-dependent noise
- Corrigendum to: "Classification with asymmetric label noise: consistency and maximal denoising"
- Title not available
- Positive-unlabeled classification under class-prior shift: a prior-invariant approach based on density ratio estimation
- Classification with label noise: a Markov chain sampling framework
- On the noise estimation statistics
- Learning from positive and unlabeled data: a survey