Multiclass boosting with adaptive group-based \(k\)NN and its application in text categorization (Q1955197)
Property / full work available at URL: https://doi.org/10.1155/2012/793490
Property / OpenAlex ID: W2062644002
Property / cites work: Improved boosting algorithms using confidence-rated predictions
Property / cites work: Q4450936
Property / cites work: Automatic hardware implementation tool for a discrete Adaboost-based decision algorithm
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Multiclass boosting with adaptive group-based \(k\)NN and its application in text categorization | scientific article |
Statements
Multiclass boosting with adaptive group-based \(k\)NN and its application in text categorization (English)
11 June 2013
Summary: AdaBoost is an excellent committee-based tool for classification. However, its effectiveness and efficiency in multiclass categorization face challenges from methods based on support vector machines (SVM), neural networks (NN), naïve Bayes, and \(k\)-nearest neighbors (\(k\)NN). This paper uses a novel multiclass AdaBoost algorithm that avoids reducing the multiclass classification problem to multiple two-class problems. This method is more effective and retains the accuracy advantage of existing AdaBoost. An adaptive group-based \(k\)NN method is proposed to build more accurate weak classifiers and thereby keep the number of base classifiers within an acceptable range. To further enhance performance, the weak classifiers are combined into a strong classifier through a doubly iterative weighting scheme, yielding the adaptive group-based \(k\)NN boosting algorithm (AG\(k\)NN-AdaBoost). We implement AG\(k\)NN-AdaBoost in a Chinese text categorization system. Experimental results show that the proposed classification algorithm achieves better precision and recall than many other text categorization methods, including traditional AdaBoost. In addition, its processing speed is significantly higher than that of the original AdaBoost and many other classic categorization algorithms.
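The abstract does not spell out the AG\(k\)NN-AdaBoost algorithm itself, so the following is only a rough sketch of the general recipe it describes (boosting with \(k\)NN weak classifiers in a multiclass setting). It uses a SAMME-style multiclass AdaBoost update with \(k\)NN voters that scale each neighbor's vote by its current boosting weight; all function names and the SAMME update are assumptions for illustration, not the paper's method.

```python
# Hedged sketch (assumption): a SAMME-style multiclass AdaBoost whose weak
# learners are kNN voters weighted by the current boosting distribution.
# This is NOT the paper's AG(k)NN-AdaBoost, only the general idea.
import math
from collections import Counter

def knn_predict(X, y, w, x, k=3, exclude=-1):
    """kNN vote on point x; each neighbor's vote counts its boosting weight."""
    order = sorted(
        (i for i in range(len(X)) if i != exclude),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(X[i], x)),
    )
    votes = Counter()
    for i in order[:k]:
        votes[y[i]] += w[i]
    return votes.most_common(1)[0][0]

def boost(X, y, rounds=5, k=3):
    """Fit a multiclass ensemble of weight-aware kNN weak learners."""
    n, K = len(X), len(set(y))
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        w_t = list(w)  # weight snapshot defining this round's weak learner
        # leave-one-out predictions on the training set
        preds = [knn_predict(X, y, w_t, X[i], k, exclude=i) for i in range(n)]
        err = sum(wi for wi, p, t in zip(w, preds, y) if p != t)
        err = min(max(err, 1e-10), 1 - 1e-10)
        if err >= (K - 1) / K:  # no better than random guessing: stop
            break
        alpha = math.log((1 - err) / err) + math.log(K - 1)  # SAMME term
        for i in range(n):
            if preds[i] != y[i]:
                w[i] *= math.exp(alpha)  # focus later rounds on hard points
        s = sum(w)
        w = [wi / s for wi in w]
        ensemble.append((alpha, w_t))
    return ensemble

def predict(ensemble, X, y, x, k=3):
    """Alpha-weighted vote over all weak learners in the ensemble."""
    votes = Counter()
    for alpha, w_t in ensemble:
        votes[knn_predict(X, y, w_t, x, k)] += alpha
    return votes.most_common(1)[0][0]
```

The \(\log(K-1)\) term in `alpha` is what lets a multiclass booster accept weak learners that merely beat \(1/K\) random guessing, which is why no reduction to multiple two-class problems is needed.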