Iterative feature selection in least square regression estimation

From MaRDI portal
Publication: Q731450

DOI: 10.1214/07-AIHP106
zbMATH Open: 1206.62067
arXiv: math/0511299
OpenAlex: W2095350026
MaRDI QID: Q731450


Author: Pierre Alquier


Publication date: 7 October 2009

Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques

Abstract: In this paper, we focus on regression estimation in both the inductive and the transductive case. We assume that we are given a set of features (which can be a base of functions, but not necessarily). We begin by giving a deviation inequality on the risk of an estimator in every model defined by using a single feature. These models are too simple to be useful by themselves, but we then show how this result motivates an iterative algorithm that performs feature selection in order to build a suitable estimator. We prove that every selected feature actually improves the performance of the estimator. We give all the estimators and results at first in the inductive case, which requires the knowledge of the distribution of the design, and then in the transductive case, in which we do not need to know this distribution.
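The abstract describes building an estimator by adding one feature at a time, keeping a feature only if it provably improves the risk. The paper's actual procedure relies on deviation inequalities not reproduced here, so the following is only a minimal greedy sketch of the idea in the simplest empirical form: at each step, fit a one-feature least squares correction to the current residual, keep the feature with the largest reduction in empirical squared risk, and stop when no remaining feature helps beyond a (hypothetical) threshold. The data, threshold `1e-3`, and all variable names are illustrative assumptions, not the paper's.

```python
import numpy as np

# Illustrative sketch only, NOT the paper's estimator: greedy single-feature
# least squares updates on the residual, stopping when the empirical risk
# no longer improves past a small threshold.

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))                      # the given set of features
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=n)

selected = []                                    # indices of kept features
coef = np.zeros(p)                               # current estimator
residual = y.copy()

for _ in range(p):
    best_j, best_gain, best_beta = None, 0.0, 0.0
    for j in range(p):
        if j in selected:
            continue
        # one-dimensional least squares fit of the residual on feature j
        beta = (X[:, j] @ residual) / (X[:, j] @ X[:, j])
        gain = np.mean(residual**2) - np.mean((residual - beta * X[:, j])**2)
        if gain > best_gain:
            best_j, best_gain, best_beta = j, gain, beta
    if best_j is None or best_gain < 1e-3:       # no feature improves enough
        break
    selected.append(best_j)
    coef[best_j] += best_beta
    residual -= best_beta * X[:, best_j]

print(sorted(selected))                          # the informative features
```

On this synthetic data the greedy loop tends to recover the two features actually used to generate `y`; the paper's contribution is the non-asymptotic guarantee that each accepted feature improves the estimator, in both the inductive and transductive settings.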


Full work available at URL: https://arxiv.org/abs/math/0511299










Cited In (8)





