A relative decision entropy-based feature selection approach
From MaRDI portal
Publication: 1678710
DOI: 10.1016/j.patcog.2015.01.023
zbMATH Open: 1374.68391
OpenAlex: W2045810864
MaRDI QID: Q1678710
Authors: Feng Jiang, Yuefei Sui, Lin Zhou
Publication date: 20 November 2017
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/j.patcog.2015.01.023
Recommendations
- An improved heuristic feature selection algorithm based on rough set theory
- Scientific article (zbMATH DE number 6400786)
- A rough set approach to feature selection based on scatter search metaheuristic
- A New Heuristic Feature Selection Algorithm Based on Rough Sets
- Attribute reduction based on approximation decision entropy
Keywords: feature selection; rough sets; roughness; degree of dependency; feature significance; relative decision entropy
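The record itself contains no formulas, but the title and keywords point to entropy-based feature selection over rough set partitions. Below is a minimal illustrative sketch, not the paper's actual relative decision entropy measure: it greedily adds the condition attribute that most reduces the conditional Shannon entropy of the decision attribute over the equivalence classes of the indiscernibility relation. All names and the toy data are assumptions made for illustration.

```python
from collections import Counter, defaultdict
import math

def partition(rows, attrs):
    """Group row indices by their values on the given attributes
    (the indiscernibility relation of rough set theory)."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def conditional_entropy(rows, attrs, labels):
    """Shannon entropy of the decision attribute within each
    equivalence class of `attrs`, weighted by class size.
    With attrs == [] this is just H(D)."""
    n = len(rows)
    h = 0.0
    for block in partition(rows, attrs):
        counts = Counter(labels[i] for i in block)
        for c in counts.values():
            p = c / len(block)
            h -= (len(block) / n) * p * math.log2(p)
    return h

def greedy_select(rows, labels, all_attrs):
    """Forward selection: repeatedly add the attribute giving the
    largest drop in conditional decision entropy; stop when the
    entropy reaches zero or no attribute helps."""
    selected, remaining = [], list(all_attrs)
    current = conditional_entropy(rows, selected, labels)
    while remaining and current > 0:
        gains = {a: current - conditional_entropy(rows, selected + [a], labels)
                 for a in remaining}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break  # no remaining attribute reduces the uncertainty
        selected.append(best)
        remaining.remove(best)
        current -= gains[best]
    return selected

# Toy decision table: the label is a AND b; c is constant noise.
rows = [{'a': 0, 'b': 0, 'c': 0}, {'a': 0, 'b': 1, 'c': 0},
        {'a': 1, 'b': 0, 'c': 0}, {'a': 1, 'b': 1, 'c': 0}]
labels = [0, 0, 0, 1]
print(greedy_select(rows, labels, ['a', 'b', 'c']))  # → ['a', 'b']
```

On this toy table both informative attributes are selected and the uninformative `c` is dropped; the paper's "relative" measure presumably refines this plain conditional entropy, but the record gives no formula to reproduce.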
Cites Work
- Wrappers for feature subset selection
- A Mathematical Theory of Communication
- Rough sets
- Supervised feature selection by clustering using conditional mutual information-based distances
- Title not available
- Selecting informative features with fuzzy-rough sets and its application for complex systems monitoring
- Uncertainty measures of rough set prediction
- A rough-fuzzy approach for generating classification rules
- An efficient accelerator for attribute reduction from incomplete data in rough set framework
- Hybrid attribute reduction based on a novel fuzzy-rough model and information granulation
- Title not available
- An Enhanced Support Vector Machine Model for Intrusion Detection
- Exploring the boundary region of tolerance rough sets for feature selection
- Information entropy, rough entropy and knowledge granulation in incomplete information systems
- Rough sets and intelligent data analysis
- A new method for measuring the uncertainty in incomplete information systems
- Neighborhood rough set based heterogeneous feature subset selection
- Feature analysis through information granulation and fuzzy sets
- THE ALGORITHM ON KNOWLEDGE REDUCTION IN INCOMPLETE INFORMATION SYSTEMS
- Feature selection with dynamic mutual information
- Title not available
- A hybrid filter/wrapper approach of feature selection using information theory
- Correntropy based feature selection using binary projection
- Title not available
Cited In (17)
- Three-way attribute reducts
- Three-way class-specific attribute reducts from the information viewpoint
- Systematic attribute reductions based on double granulation structures and three-view uncertainty measures in interval-set decision systems
- Feature selection based on double-hierarchical and multiplication-optimal fusion measurement in fuzzy neighborhood rough sets
- Feature selection with SVD entropy: some modification and extension
- Three-way weighted combination-entropies based on three-layer granular structures
- Initialization of \(K\)-modes clustering using outlier detection techniques
- Ensemble learning based on approximate reducts and bootstrap sampling
- Granular-conditional-entropy-based attribute reduction for partially labeled data with proxy labels
- Feature selection with partition differentiation entropy for large-scale data sets
- Efficient feature selection based on correlation measure between continuous and discrete features
- Class-specific information measures and attribute reducts for hierarchy and systematicness
- An improved heuristic feature selection algorithm based on rough set theory
- Core-generating discretization for rough set feature selection
- Efficient attribute reduction from the viewpoint of discernibility
- A random approximate reduct-based ensemble learning approach and its application in software defect prediction
- Feature selection in mixed data: a method using a novel fuzzy rough set-based information entropy