Feature selection with test cost constraint
From MaRDI portal
Publication: 2353670
Abstract: Feature selection is an important preprocessing step in machine learning and data mining. In real-world applications, acquiring features requires costs, including money, time, and other resources. In some cases, limited resources impose a test cost constraint, so we must deliberately select a feature subset that is both informative and cheap for classification. This paper proposes the feature-selection-with-test-cost-constraint problem to address this issue. The new problem has a simple form when described as a constraint satisfaction problem (CSP). Backtracking is a general algorithm for CSPs, and it solves the new problem efficiently on medium-sized data. As the backtracking algorithm does not scale to large datasets, a heuristic algorithm is also developed. Experimental results show that the heuristic algorithm finds the optimal solution in most cases. We also redefine some existing feature selection problems in rough sets, especially in decision-theoretic rough sets, from the viewpoint of CSPs. These new definitions provide insight into some new research directions.
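The backtracking search the abstract refers to can be illustrated with a minimal sketch: enumerate feature subsets, prune any branch whose accumulated test cost already exceeds the budget, and keep the subset with the best score. The function name `backtrack_select` and the generic `quality` callback are illustrative assumptions; the paper's actual objective is rough-set based, not shown here.

```python
from typing import Callable, Sequence, Tuple

def backtrack_select(
    costs: Sequence[float],
    budget: float,
    quality: Callable[[frozenset], float],
) -> Tuple[frozenset, float]:
    """Backtracking search for the feature subset whose total test cost
    stays within `budget` and whose `quality` score is maximal.

    `quality` is a stand-in for the paper's informativeness measure;
    any subset-scoring function can be plugged in.
    """
    n = len(costs)
    best = (frozenset(), quality(frozenset()))  # best (subset, score) so far

    def recurse(i: int, chosen: frozenset, cost: float) -> None:
        nonlocal best
        if i == n:  # all features decided: evaluate this leaf
            q = quality(chosen)
            if q > best[1]:
                best = (chosen, q)
            return
        # Branch 1: exclude feature i.
        recurse(i + 1, chosen, cost)
        # Branch 2: include feature i, but prune if it would exceed the budget.
        if cost + costs[i] <= budget:
            recurse(i + 1, chosen | {i}, cost + costs[i])

    recurse(0, frozenset(), 0.0)
    return best
```

For example, with costs `[3, 1, 4]`, a budget of 4, and subset size as a toy quality measure, the search returns `{0, 1}` (total cost 4), since every larger subset breaks the budget. The exhaustive two-way branching is what limits this approach to medium-sized data, motivating the paper's heuristic alternative.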
Cited in (22):
- Optimization on the complementation procedure towards efficient implementation of the index generation function
- Min-max attribute-object bireducts: on unifying models of reducts in rough set theory
- A novel method to attribute reduction based on weighted neighborhood probabilistic rough sets
- Cost-sensitive rough set approach
- Efficient parallel Boolean matrix based algorithms for computing composite rough set approximations
- Multicost Decision-Theoretic Rough Sets Based on Maximal Consistent Blocks
- Incremental updating reduction for relation decision systems with dynamic conditional relation sets
- A set-cover-based approach for the test-cost-sensitive attribute reduction problem
- Cost-sensitive three-way class-specific attribute reduction
- Cost-constrained group feature selection using information theory
- Covering-based multi-granulation fuzzy rough sets
- Divide and conquer algorithm for minimal cost feature selection with measurement error
- Cost-sensitive attribute reduction in decision-theoretic rough set models
- scientific article; zbMATH DE number 7370644
- On resilient feature selection: computational foundations of \(r\)-\(\mathbb{C} \)-reducts
- Pointwise mutual information sparsely embedded feature selection
- A new effect-based roughness measure for attribute reduction in information system
- \(\ast\)-reductions in a knowledge base
- Cost-sensitive sequential three-way decision modeling using a deep neural network
- Accelerator for supervised neighborhood based attribute reduction
- Budget constrained non-monotonic feature selection
- Cost-sensitive feature selection of numeric data with measurement errors