Classification with asymmetric label noise: consistency and maximal denoising
From MaRDI portal
Abstract: In many real-world classification problems, the labels of training examples are randomly corrupted. Most previous theoretical work on classification with label noise assumes that the two classes are separable, that the label noise is independent of the true class label, or that the noise proportions for each class are known. In this work, we give conditions that are necessary and sufficient for the true class-conditional distributions to be identifiable. These conditions are weaker than those analyzed previously, and allow for the classes to be nonseparable and the noise levels to be asymmetric and unknown. The conditions essentially state that a majority of the observed labels are correct and that the true class-conditional distributions are "mutually irreducible," a concept we introduce that limits the similarity of the two distributions. For any label noise problem, there is a unique pair of true class-conditional distributions satisfying the proposed conditions, and we argue that this pair corresponds in a certain sense to maximal denoising of the observed distributions. Our results are facilitated by a connection to "mixture proportion estimation," which is the problem of estimating the maximal proportion of one distribution that is present in another. We establish a novel rate of convergence result for mixture proportion estimation, and apply this to obtain consistency of a discrimination rule based on surrogate loss minimization. Experimental results on benchmark data and a nuclear particle classification problem demonstrate the efficacy of our approach.
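As an illustration of the "mixture proportion estimation" problem the abstract refers to, the quantity of interest is the maximal proportion kappa*(F|H) = inf_S F(S)/H(S) of a distribution H present in a distribution F. The sketch below is a hypothetical histogram-based plug-in estimator written for this page, not the estimator analyzed in the paper; the function name, binning scheme, and sample sizes are illustrative assumptions.

```python
import numpy as np

def mixture_proportion_estimate(sample_f, sample_h, n_bins=10):
    """Histogram plug-in sketch of mixture proportion estimation:
    approximate kappa*(F|H) = inf_S F(S)/H(S), the largest proportion
    of H that can be "subtracted" from F, by taking the minimum ratio
    of empirical bin probabilities over a shared binning.
    (Illustrative only; not the estimator from the paper.)"""
    lo = min(sample_f.min(), sample_h.min())
    hi = max(sample_f.max(), sample_h.max())
    bins = np.linspace(lo, hi, n_bins + 1)
    # Empirical probabilities of each bin under F and H.
    f_prop = np.histogram(sample_f, bins=bins)[0] / len(sample_f)
    h_prop = np.histogram(sample_h, bins=bins)[0] / len(sample_h)
    mask = h_prop > 0  # only bins where H has observed mass
    return float(np.min(f_prop[mask] / h_prop[mask]))

# Example: F is a 0.3/0.7 mixture of H = N(0,1) and a second
# well-separated component N(4,1); the estimate should fall well
# below 1 (ideally near the true proportion 0.3).
rng = np.random.default_rng(0)
h = rng.normal(0.0, 1.0, size=10_000)
f = np.concatenate([rng.normal(0.0, 1.0, size=3_000),
                    rng.normal(4.0, 1.0, size=7_000)])
print(mixture_proportion_estimate(f, h))
```

When F and H coincide, the ratio is 1 in every occupied bin, so the estimate is exactly 1; the identifiability conditions in the paper (in particular mutual irreducibility) are what rule out such degenerate overlap between the true class-conditional distributions.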
Recommendations
- Corrigendum to: ``Classification with asymmetric label noise: consistency and maximal denoising''
- Convex and non-convex approaches for statistical inference with class-conditional noisy labels
- How to handle noisy labels for robust learning from uncertainty
- Classification with label noise: a Markov chain sampling framework
- On classifier behavior in the presence of mislabeling noise
- Asymmetric Error Control Under Imperfect Supervision: A Label-Noise-Adjusted Neyman–Pearson Umbrella Algorithm
- Learning from binary labels with instance-dependent noise
- scientific article; zbMATH DE number 6982911
- Margin-based generalization for classifications with input noise
Cites work
- scientific article; zbMATH DE number 3984308
- scientific article; zbMATH DE number 51536
- scientific article; zbMATH DE number 1333815
- scientific article; zbMATH DE number 893887
- Boosting in the presence of noise
- Calibrated asymmetric surrogate losses
- Classification with asymmetric label noise: consistency and maximal denoising
- Composite binary losses
- Convexity, Classification, and Risk Bounds
- Efficient noise-tolerant learning from statistical queries
- Foundations of machine learning
- Multi-instance learning with any hypothesis class
- Noise-tolerant distribution-free learning of general geometric concepts
- Random classification noise defeats all convex potential boosters
- Robust supervised classification with mixture models: learning from data with uncertain labels
- Semi-supervised novelty detection
- Strong uniform times and finite random walks
- Support Vector Machines
- Tutorial on practical prediction theory for classification
Cited in (20)
- On the noise estimation statistics
- Learning from binary labels with instance-dependent noise
- Estimating the class prior for positive and unlabelled data via logistic regression
- Corrigendum to: ``Classification with asymmetric label noise: consistency and maximal denoising''
- Principled analytic classifier for positive-unlabeled learning via weighted integral probability metric
- Classification trees with mismeasured responses
- Binary classification with corrupted labels
- A theory of learning with corrupted labels
- Classification with imperfect training labels
- Positive-unlabeled classification under class-prior shift: a prior-invariant approach based on density ratio estimation
- Adaptive transfer learning
- Boosting in the presence of outliers: adaptive classification with nonconvex loss functions
- Classification with asymmetric label noise: consistency and maximal denoising
- Harmless label noise and informative soft-labels in supervised classification
- Learning from positive and unlabeled data: a survey
- scientific article; zbMATH DE number 7370519
- Classification with label noise: a Markov chain sampling framework
- scientific article; zbMATH DE number 6982911
- Asymmetric Error Control Under Imperfect Supervision: A Label-Noise-Adjusted Neyman–Pearson Umbrella Algorithm
- Decontamination of mutual contamination models
This page was built for publication: Classification with asymmetric label noise: consistency and maximal denoising