Iterative feature selection in least square regression estimation

From MaRDI portal

Abstract: In this paper, we focus on regression estimation in both the inductive and the transductive case. We assume that we are given a set of features (which can be a basis of functions, but not necessarily). We begin by giving a deviation inequality on the risk of an estimator in every model defined by a single feature. These models are too simple to be useful by themselves, but we then show how this result motivates an iterative algorithm that performs feature selection in order to build a suitable estimator. We prove that every selected feature actually improves the performance of the estimator. We first give all estimators and results in the inductive case, which requires knowledge of the distribution of the design, and then in the transductive case, in which this distribution need not be known.

MaRDI item: Q731450