Iterative feature selection in least square regression estimation
DOI: 10.1214/07-AIHP106 · zbMATH Open: 1206.62067 · arXiv: math/0511299 · OpenAlex: W2095350026 · MaRDI QID: Q731450
Authors: Pierre Alquier
Publication date: 7 October 2009
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://arxiv.org/abs/math/0511299
Recommendations
- Lasso, iterative feature selection and the correlation selector: oracle inequalities and numerical performances
- An algorithm for iterative selection of blocks of features
- Least angle regression. (With discussion)
- Model selection for regularized least-squares algorithm in learning theory
- Least square regression with \(l^{p}\)-coefficient regularization
Keywords: confidence regions; statistical learning; support vector machines; regression estimation; thresholding methods
MSC: Nonparametric regression and quantile regression (62G08); Learning and adaptive systems in artificial intelligence (68T05); Nonparametric tolerance and confidence regions (62G15)
Cites Work
- Ideal spatial adaptation by wavelet shrinkage
- Title not available
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Density estimation by wavelet thresholding
- An introduction to support vector machines and other kernel-based learning methods.
- Title not available
- Approximation and learning by greedy algorithms
- Wavelets, approximation, and statistical applications
- Regression in random design and warped wavelets
- Aggregating regression procedures to improve performance
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Optimal aggregation of classifiers in statistical learning.
- PAC-Bayesian Generalisation Error Bounds for Gaussian Process Classification
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Aggregated estimators and empirical complexity for least square regression
- Symmetrization approach to concentration inequalities for empirical processes.
Cited In (8)
- Iterative selection using orthogonal regression techniques
- An algorithm for iterative selection of blocks of features
- Fast learning rates in statistical inference through aggregation
- Density estimation with quadratic loss: a confidence intervals method
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Title not available
- Lasso, iterative feature selection and the correlation selector: oracle inequalities and numerical performances
- Title not available