A hybrid feature selection method based on rough conditional mutual information and naive Bayesian classifier (Q469921)
From MaRDI portal
Property / review text
Property / review text: Summary: We introduce a novel hybrid feature selection method based on rough conditional mutual information and a naive Bayesian classifier. Conditional mutual information is an important metric in feature selection, but it is hard to compute. We introduce a new measure, called rough conditional mutual information, which is based on rough sets; it is shown that the new measure can substitute for Shannon's conditional mutual information. Thus rough conditional mutual information can also be used to filter out irrelevant and redundant features. Subsequently, to reduce the number of features and improve classification accuracy, a wrapper approach based on the naive Bayesian classifier is used to search for the optimal feature subset within the candidate feature subset selected by the filter model. Finally, the proposed algorithms are tested on several UCI datasets and compared with other classical feature selection methods. The results show that our approach obtains not only high classification accuracy but also the smallest number of selected features. / rank
Normal rank
Property / Mathematics Subject Classification ID
Property / Mathematics Subject Classification ID: 62H30 / rank
Normal rank
Property / Mathematics Subject Classification ID
Property / Mathematics Subject Classification ID: 62F15 / rank
Normal rank
Property / zbMATH DE Number
Property / zbMATH DE Number: 6368345 / rank
Normal rank
Property / Wikidata QID
Property / Wikidata QID: Q59047432 / rank
Normal rank
Property / describes a project that uses
Property / describes a project that uses: UCI-ml / rank
Normal rank
Property / describes a project that uses
Property / describes a project that uses: ReliefF / rank
Normal rank
Property / MaRDI profile type
Property / MaRDI profile type: MaRDI publication profile / rank
Normal rank
Property / full work available at URL
Property / full work available at URL: https://doi.org/10.1155/2014/382738 / rank
Normal rank
Property / OpenAlex ID
Property / OpenAlex ID: W2052789821 / rank
Normal rank
Property / cites work
Property / cites work: 10.1162/153244303322753616 / rank
Normal rank
Property / cites work
Property / cites work: Consistency-based search in feature selection / rank
Normal rank
Property / cites work
Property / cites work: Theoretical and empirical analysis of ReliefF and RReliefF / rank
Normal rank
Property / cites work
Property / cites work: Q4250258 / rank
Normal rank
Property / cites work
Property / cites work: Supervised feature selection by clustering using conditional mutual information-based distances / rank
Normal rank
Property / cites work
Property / cites work: A fast separability-based feature-selection method for high-dimensional remotely sensed image classification / rank
Normal rank
Property / cites work
Property / cites work: Q4863755 / rank
Normal rank
Property / cites work
Property / cites work: Q4727203 / rank
Normal rank
Property / cites work
Property / cites work: Uncertainty measures of rough set prediction / rank
Normal rank
Property / cites work
Property / cites work: Uncertainty-based information. Elements of generalized information theory. / rank
Normal rank
Property / cites work
Property / cites work: THE INFORMATION ENTROPY, ROUGH ENTROPY AND KNOWLEDGE GRANULATION IN ROUGH SET THEORY / rank
Normal rank
Property / cites work
Property / cites work: A Mathematical Theory of Communication / rank
Normal rank
Property / cites work
Property / cites work: Q4023085 / rank
Normal rank
Property / cites work
Property / cites work: 10.1162/jmlr.2003.4.7-8.1205 / rank
Normal rank
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | A hybrid feature selection method based on rough conditional mutual information and naive Bayesian classifier | scientific article |
Statements
A hybrid feature selection method based on rough conditional mutual information and naive Bayesian classifier (English)
0 references
11 November 2014
0 references
Summary: We introduce a novel hybrid feature selection method based on rough conditional mutual information and a naive Bayesian classifier. Conditional mutual information is an important metric in feature selection, but it is hard to compute. We introduce a new measure, called rough conditional mutual information, which is based on rough sets; it is shown that the new measure can substitute for Shannon's conditional mutual information. Thus rough conditional mutual information can also be used to filter out irrelevant and redundant features. Subsequently, to reduce the number of features and improve classification accuracy, a wrapper approach based on the naive Bayesian classifier is used to search for the optimal feature subset within the candidate feature subset selected by the filter model. Finally, the proposed algorithms are tested on several UCI datasets and compared with other classical feature selection methods. The results show that our approach obtains not only high classification accuracy but also the smallest number of selected features.
0 references
0 references
0 references
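
The summary above describes a two-stage filter-wrapper pipeline: a cheap information-theoretic filter followed by a naive Bayes wrapper search. The Python sketch below illustrates the general shape of such a pipeline under explicit assumptions, not the paper's exact algorithm: the filter score is an ordinary discrete conditional-mutual-information estimate on pre-discretised features (a stand-in for the paper's rough conditional mutual information, whose definition is not reproduced in this record), the greedy ranking conditions only on the most recently selected feature, and the wrapper uses scikit-learn's GaussianNB with cross-validation. All function names (`conditional_mutual_information`, `filter_stage`, `wrapper_stage`) and parameter choices are illustrative.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.preprocessing import KBinsDiscretizer


def conditional_mutual_information(x, y, z):
    """Estimate I(x; y | z) in bits for discrete, integer-coded 1-D arrays."""
    n = len(x)
    cells, counts = np.unique(np.stack([x, y, z], axis=1), axis=0, return_counts=True)
    cmi = 0.0
    for (xv, yv, zv), c_xyz in zip(cells, counts):
        c_z = np.sum(z == zv)
        c_xz = np.sum((x == xv) & (z == zv))
        c_yz = np.sum((y == yv) & (z == zv))
        # p(x,y,z) * log2[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
        cmi += (c_xyz / n) * np.log2(c_xyz * c_z / (c_xz * c_yz))
    return cmi


def filter_stage(X, y, k):
    """Filter step: greedily rank k features by conditional mutual information.

    As a simplification, each candidate is conditioned only on the most
    recently selected feature rather than on the full selected subset.
    """
    n_features = X.shape[1]
    selected = []
    for _ in range(min(k, n_features)):
        z = X[:, selected[-1]] if selected else np.zeros(len(y), dtype=int)
        scores = [
            -np.inf if j in selected
            else conditional_mutual_information(X[:, j], y, z)
            for j in range(n_features)
        ]
        selected.append(int(np.argmax(scores)))
    return selected


def wrapper_stage(X, y, candidates):
    """Wrapper step: forward selection over the candidates, scored by naive Bayes."""
    chosen, best = [], -np.inf
    improved = True
    while improved:
        improved = False
        for j in (c for c in candidates if c not in chosen):
            score = cross_val_score(GaussianNB(), X[:, chosen + [j]], y, cv=5).mean()
            if score > best:
                best, best_j, improved = score, j, True
        if improved:
            chosen.append(best_j)
    return chosen, best


if __name__ == "__main__":
    data = load_iris()
    # The filter score assumes discrete features, so discretise up front.
    X = KBinsDiscretizer(n_bins=4, encode="ordinal", strategy="uniform") \
        .fit_transform(data.data).astype(int)
    candidates = filter_stage(X, data.target, k=3)                 # filter: cheap ranking
    chosen, accuracy = wrapper_stage(X, data.target, candidates)   # wrapper: NB search
    print("candidate features:", candidates)
    print("final subset:", chosen, "cross-validated accuracy: %.3f" % accuracy)
```

The division of labour mirrors the summary: the filter prunes the feature space with inexpensive counting-based scores, and the wrapper pays for classifier evaluations only on the surviving candidate subset.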