Multi-stage classifier design
From MaRDI portal
Publication:374149
DOI: 10.1007/S10994-013-5349-4 · zbMATH Open: 1273.68306 · arXiv: 1205.4377 · OpenAlex: W1979871898 · MaRDI QID: Q374149 · FDO: Q374149
Authors: Kirill Trapeznikov, Venkatesh Saligrama, David A. Castanon
Publication date: 22 October 2013
Published in: Machine Learning
Abstract: In many classification systems, sensing modalities have different acquisition costs. It is often unnecessary to use every modality to classify a majority of examples. We study a multi-stage system in a prediction-time cost reduction setting, where the full data is available for training, but for a test example, measurements in a new modality can be acquired at each stage for an additional cost. We seek decision rules that reduce the average measurement acquisition cost. We formulate an empirical risk minimization (ERM) problem for a multi-stage reject classifier, wherein the stage classifier either classifies a sample using only the measurements acquired so far or rejects it to the next stage, where more attributes can be acquired for a cost. To solve the ERM problem, we show that the optimal reject classifier at each stage is a combination of two binary classifiers, one biased towards positive examples and the other biased towards negative examples. We use this parameterization to construct a stage-by-stage global surrogate risk, develop an iterative algorithm in the boosting framework, and present convergence and generalization results. We test our work on synthetic, medical, and explosives detection datasets. Our results demonstrate that substantial cost reduction is achievable without a significant sacrifice in accuracy.
Full work available at URL: https://arxiv.org/abs/1205.4377
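To illustrate the abstract's central idea, here is a minimal sketch (not the authors' boosting algorithm) of a two-stage reject classifier: each non-final stage combines a positively-biased and a negatively-biased binary classifier, and an example on which they disagree is rejected to the next stage, which acquires a costlier modality. The data, thresholds, and feature names below are illustrative assumptions.

```python
import numpy as np

# Synthetic two-modality data: stage 1 sees only a cheap noisy feature;
# stage 2 additionally acquires a costly, more informative feature.
rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, n)                  # true labels in {0, 1}
x_cheap = y + rng.normal(0.0, 0.9, n)      # stage-1 modality (noisy, cheap)
x_costly = y + rng.normal(0.0, 0.2, n)     # stage-2 modality (clean, costly)

# Stage 1: two biased thresholds on the cheap feature (illustrative values).
T_POS, T_NEG = 0.0, 1.0                    # T_POS < T_NEG creates a reject band
pos_biased = x_cheap > T_POS               # classifier biased toward label 1
neg_biased = x_cheap > T_NEG               # classifier biased toward label 0
confident = pos_biased == neg_biased       # both agree -> classify now
stage1_pred = neg_biased                   # the agreed label where confident

# Stage 2: rejected examples pay for the costly modality; no reject allowed
# at the final stage, so a single threshold classifies everything.
stage2_pred = x_costly > 0.5

pred = np.where(confident, stage1_pred, stage2_pred).astype(int)
accuracy = float((pred == y).mean())
reject_frac = float((~confident).mean())   # fraction paying the stage-2 cost
print(f"accuracy={accuracy:.3f}, sent to stage 2: {reject_frac:.2%}")
```

Only the examples falling in the disagreement band between the two biased classifiers incur the second acquisition cost, which is how the average measurement cost drops without classifying every example on partial data.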
Recommendations
Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- The elements of statistical learning. Data mining, inference, and prediction
- Planning and acting in partially observable stochastic domains
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Boosting as a regularized path to a maximum margin classifier
- Reducing multiclass to binary: A unifying approach for margin classifiers
- On optimum recognition error and reject tradeoff
- Classification with a reject option using a hinge loss
- Advances in knowledge discovery and data mining. 7th Pacific-Asia conference, PAKDD 2003, Seoul, Korea, April 30 -- May 2, 2003. Proceedings
- Agnostic pointwise-competitive selective classification
- Cost-sensitive feature acquisition and classification
Cited In (1)
This page was built for publication: Multi-stage classifier design