Domain adaptation -- can quantity compensate for quality? (Q2248544)
Cites works:
- On the Hardness of Domain Adaptation and the Utility of Unlabeled Target Samples
- Fast rates for support vector machines using Gaussian kernels
- \(\epsilon\)-nets and simplex range queries
- Understanding Machine Learning
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Domain adaptation -- can quantity compensate for quality? | scientific article |
Statements
Domain adaptation -- can quantity compensate for quality? (English)
26 June 2014
In this paper, the authors address the question: ``under which circumstances can large, not-perfectly-representative training samples guarantee that the learned classifier performs just as well as one learned from target-generated samples?'' Their answer is positive, in particular for a nearest-neighbor algorithm, under specific assumptions relating the source and target distributions. Furthermore, they show that, when the output classifier has to come from a predefined class, any learner needs access to data generated from the target distribution.
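The setting can be illustrated with a minimal sketch (not the paper's construction; the distributions, labeling rule, and sample sizes below are invented for illustration): a 1-nearest-neighbor classifier is trained only on source-distribution samples and then evaluated on a shifted target distribution that shares the same labeling rule (covariate shift). With a large enough source sample, the source-trained classifier also does well on the target.

```python
# Illustrative sketch, assuming covariate shift: source and target differ in
# their marginal over the inputs, but share one deterministic labeling rule.
import random

random.seed(0)

def label(x):
    # Shared labeling rule on [0, 1]: class 1 iff x > 0.5.
    return 1 if x > 0.5 else 0

def nn_predict(train, x):
    # 1-nearest-neighbor: return the label of the closest training point.
    nearest = min(train, key=lambda p: abs(p[0] - x))
    return nearest[1]

# Source marginal: uniform on [0, 1] -- plentiful but not target-representative.
source_xs = [random.uniform(0, 1) for _ in range(2000)]
train = [(x, label(x)) for x in source_xs]

# Target marginal: concentrated near the decision boundary at 0.5.
target_xs = [min(max(random.gauss(0.5, 0.1), 0.0), 1.0) for _ in range(1000)]

errors = sum(nn_predict(train, x) != label(x) for x in target_xs)
print("target error of source-trained 1-NN:", errors / len(target_xs))
```

Because the uniform source sample covers the target's support densely, the nearest neighbor of a target point is almost always on the correct side of the boundary, so the printed target error is small; quantity of source data compensates here, in the spirit of the paper's positive result.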
Keywords: machine learning; domain adaptation; sample complexity