Binary Linear Classification and Feature Selection via Generalized Approximate Message Passing

From MaRDI portal

DOI: 10.1109/TSP.2015.2407311
zbMATH Open: 1394.62007
arXiv: 1401.0872
MaRDI QID: Q4580512

Philip Schniter, Per B. Sederberg, Justin Ziniel

Publication date: 22 August 2018

Published in: IEEE Transactions on Signal Processing

Abstract: For the problem of binary linear classification and feature selection, we propose algorithmic approaches to classifier design based on the generalized approximate message passing (GAMP) algorithm, recently proposed in the context of compressive sensing. We are particularly motivated by problems where the number of features greatly exceeds the number of training examples, but where only a few features suffice for accurate classification. We show that sum-product GAMP can be used to (approximately) minimize the classification error rate and max-sum GAMP can be used to minimize a wide variety of regularized loss functions. Furthermore, we describe an expectation-maximization (EM)-based scheme to learn the associated model parameters online, as an alternative to cross-validation, and we show that GAMP's state-evolution framework can be used to accurately predict the misclassification rate. Finally, we present a detailed numerical study to confirm the accuracy, speed, and flexibility afforded by our GAMP-based approaches to binary linear classification and feature selection.


Full work available at URL: https://arxiv.org/abs/1401.0872
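The abstract states that max-sum GAMP can minimize a wide variety of regularized loss functions; ℓ1-penalized logistic regression is one such objective, well suited to the regime the paper targets (far more features than training examples, few relevant features). The sketch below solves that same objective with a plain proximal-gradient (ISTA) baseline. It is NOT the paper's GAMP algorithm, only an illustration of the problem setting; all function names, dimensions, and parameter values are illustrative assumptions.

```python
# Baseline sketch (NOT GAMP): l1-regularized logistic regression via ISTA,
# i.e. minimize (1/n) * sum_i log(1 + exp(-y_i * a_i^T w)) + lam * ||w||_1.
import numpy as np

def soft_threshold(w, t):
    # Proximal operator of t * ||.||_1 (element-wise shrinkage).
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def l1_logistic(A, y, lam=0.05, step=0.1, iters=500):
    """Proximal gradient descent on the averaged logistic loss + l1 penalty."""
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(iters):
        z = np.clip(y * (A @ w), -30.0, 30.0)       # margins, clipped for stability
        grad = -(A.T @ (y / (1.0 + np.exp(z))))     # gradient of the logistic loss
        w = soft_threshold(w - step * grad / n, step * lam)
    return w

# Synthetic problem in the paper's regime: n << d, only k features relevant.
rng = np.random.default_rng(0)
n, d, k = 40, 200, 3
w_true = np.zeros(d)
w_true[:k] = 2.0
A = rng.standard_normal((n, d))
y = np.sign(A @ w_true)
w_hat = l1_logistic(A, y)
print("training accuracy:", np.mean(np.sign(A @ w_hat) == y))
```

The ℓ1 penalty drives most coefficients of `w_hat` exactly to zero, so the fitted classifier doubles as a feature selector, mirroring the sparse-classification motivation in the abstract.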

Cited In (1)