Surrogate losses in passive and active learning (Q2008623)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Surrogate losses in passive and active learning | scientific article |
Statements
Surrogate losses in passive and active learning (English)
26 November 2019
\textit{Active} supervised learning methods sequentially request labels for selected instances from a large pool of unlabeled data points. The objective of active learning is to match (or exceed) the performance obtained from a fully labeled dataset at a fraction of the cost or time it would take to label all the data, by iteratively growing a carefully selected labeled subset. The main contribution of this work is an investigation of surrogate loss functions in the context of active learning. Note that most state-of-the-art learning methods either embed a surrogate loss directly in the optimization problem, as in SVMs, or minimize a surrogate loss by iterative descent, as in AdaBoost.
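The setting described above can be illustrated with a minimal pool-based active-learning sketch. This is not the paper's algorithm: it is a generic illustration, assuming uncertainty sampling as the query rule and the logistic loss as the surrogate minimized in place of the 0-1 loss; all function names here are made up for the example.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=200):
    """Minimize the logistic (surrogate) loss by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)   # gradient of the log-loss
    return w

def active_learn(X_pool, y_pool, n_init=5, n_queries=10, rng=None):
    """Pool-based active learning with uncertainty sampling.

    One label at a time is requested for the pool point whose predicted
    probability is closest to 0.5, i.e. the point the current
    surrogate-loss minimizer is least certain about.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    labeled = list(rng.choice(len(X_pool), n_init, replace=False))
    for _ in range(n_queries):
        w = fit_logistic(X_pool[labeled], y_pool[labeled])
        p = 1.0 / (1.0 + np.exp(-X_pool @ w))
        p[labeled] = 1.0                   # never re-query a labeled point
        labeled.append(int(np.argmin(np.abs(p - 0.5))))
    return fit_logistic(X_pool[labeled], y_pool[labeled]), labeled

# Toy usage: a linearly separable pool of 200 points, of which only
# 5 + 10 = 15 ever get labeled.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w, queried = active_learn(X, y, rng=rng)
acc = np.mean(((X @ w) > 0) == y)
```

The point of the sketch is the division of labor the abstract describes: the surrogate loss drives the model fit, while the selective-sampling rule decides which labels are worth paying for.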
active learning
sequential design
selective sampling
statistical learning theory
surrogate loss functions
classification