Feature selection with test cost constraint

From MaRDI portal
Publication:2353670

DOI: 10.1016/J.IJAR.2013.04.003 · zbMATH Open: 1316.68117 · arXiv: 1209.5601 · OpenAlex: W1976516122 · MaRDI QID: Q2353670 · FDO: Q2353670

William Zhu, Qinghua Hu, Fan Min

Publication date: 16 July 2015

Published in: International Journal of Approximate Reasoning

Abstract: Feature selection is an important preprocessing step in machine learning and data mining. In real-world applications, costs, including money, time and other resources, are required to acquire features. In some cases there is a test cost constraint due to limited resources, and one must deliberately select a feature subset that is both informative and cheap for classification. This paper proposes the feature selection with test cost constraint problem to address this issue. The new problem takes a simple form when described as a constraint satisfaction problem (CSP). Backtracking is a general algorithm for CSPs, and it solves the new problem efficiently on medium-sized data. Since the backtracking algorithm does not scale to large datasets, a heuristic algorithm is also developed. Experimental results show that the heuristic algorithm finds the optimal solution in most cases. We also redefine some existing feature selection problems in rough sets, especially in decision-theoretic rough sets, from the viewpoint of CSP. These new definitions provide insight into new research directions.


Full work available at URL: https://arxiv.org/abs/1209.5601
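The abstract describes the problem only informally. Under a plausible reading (an assumption here, since the formal definition appears only in the full paper), one selects a feature subset that maximizes some quality measure subject to the total test cost of the selected features not exceeding a given budget. The sketch below illustrates the backtracking idea on a toy dataset; the quality measure (a rough-set-style positive-region count), the function names, the cost values, and the pruning rule are illustrative choices, not taken from the paper.

```python
def positive_region_size(data, labels, features):
    """Count rows whose values on `features` determine the class label
    unambiguously (a rough-set-style quality measure; an assumption,
    not necessarily the measure used in the paper)."""
    if not features:
        return 0
    groups = {}
    for row, y in zip(data, labels):
        key = tuple(row[f] for f in features)
        groups.setdefault(key, set()).add(y)
    consistent = {k for k, ys in groups.items() if len(ys) == 1}
    return sum(1 for row in data
               if tuple(row[f] for f in features) in consistent)


def backtrack_select(data, labels, costs, budget):
    """Backtracking search over feature subsets whose total test cost stays
    within `budget`; returns the best-quality subset found."""
    n = len(costs)
    best = {"subset": (), "quality": 0}

    def recurse(start, chosen, spent):
        quality = positive_region_size(data, labels, chosen)
        if quality > best["quality"]:
            best["subset"], best["quality"] = tuple(chosen), quality
        for j in range(start, n):
            if spent + costs[j] <= budget:  # prune branches that exceed the budget
                chosen.append(j)
                recurse(j + 1, chosen, spent + costs[j])
                chosen.pop()

    recurse(0, [], 0)
    return best["subset"], best["quality"]


if __name__ == "__main__":
    # Toy data: 4 features with per-feature test costs and a budget of 5.
    data = [
        (0, 1, 0, 1),
        (0, 1, 1, 0),
        (1, 0, 0, 1),
        (1, 1, 0, 0),
        (0, 0, 1, 1),
    ]
    labels = [0, 0, 1, 1, 0]
    costs = [2, 3, 1, 4]
    print(backtrack_select(data, labels, costs, budget=5))
```

The heuristic algorithm mentioned in the abstract is not detailed there; one natural variant (again an assumption) would greedily add the feature with the best quality gain per unit cost until the budget is exhausted. The abstract reports that the paper's heuristic matches the backtracking optimum in most cases.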









Cited In (18)






This page was built for publication: Feature selection with test cost constraint
