Variable selection for general index models via sliced inverse regression


DOI: 10.1214/14-AOS1233 · zbMATH Open: 1305.62234 · arXiv: 1304.4056 · MaRDI QID: Q480962

Jun S. Liu, Bo Jiang

Publication date: 12 December 2014

Published in: The Annals of Statistics

Abstract: Variable selection, also known as feature selection in machine learning, plays an important role in modeling high-dimensional data and is key to data-driven scientific discoveries. We consider the problem of detecting influential variables under the general index model, in which the response depends on the predictors through an unknown function of one or more linear combinations of them. Instead of building a predictive model of the response given combinations of predictors, we model the conditional distribution of the predictors given the response. This inverse modeling perspective motivates a stepwise procedure based on likelihood-ratio tests, which is effective and computationally efficient at identifying important variables without specifying a parametric relationship between the predictors and the response. For example, the proposed procedure can detect variables with pairwise, three-way, or even higher-order interactions among p predictors in O(p) rather than O(p^k) computational time (with k being the highest order of interaction). Its excellent empirical performance in comparison with existing methods is demonstrated through simulation studies as well as real data examples. Consistency of the variable selection procedure when both the number of predictors and the sample size go to infinity is established.
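To illustrate the inverse-regression idea the abstract builds on, here is a minimal sketch of classical sliced inverse regression (SIR): the response is sliced into groups, and the covariance of the within-slice predictor means reveals the index directions. This is a generic SIR estimator, not the paper's stepwise likelihood-ratio procedure; the function name and defaults are illustrative assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Estimate index directions via classical sliced inverse regression.

    Generic SIR sketch (not the paper's stepwise likelihood-ratio
    procedure): predictors with large coordinates in the returned
    directions are candidates for influential variables.
    """
    n, p = X.shape
    # Whiten the predictors: Z = (X - mean) @ Sigma^{-1/2}
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice the response into groups of roughly equal size
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale
    w, V = np.linalg.eigh(M)
    dirs = inv_sqrt @ V[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)
```

For a single-index model with a monotone link, the leading SIR direction concentrates its weight on the truly active predictors, which is the screening signal an inverse-modeling selection procedure exploits.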


Full work available at URL: https://arxiv.org/abs/1304.4056




Cited In (24)

