On the consistency of feature selection using greedy least squares regression
Publication:2880890
zbMATH Open: 1235.62096 · MaRDI QID: Q2880890 · FDO: Q2880890
Authors: Tong Zhang
Publication date: 17 April 2012
Published in: Journal of Machine Learning Research (JMLR)
Full work available at URL: http://www.jmlr.org/papers/v10/zhang09a.html
Cited In (32)
- A novel T-S fuzzy systems identification with block structured sparse representation
- Greedy algorithms for prediction
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Rank-Based Greedy Model Averaging for High-Dimensional Survival Data
- Fitting very large sparse Gaussian graphical models
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Inconsistency guided robust attribute reduction
- Extending greedy feature selection algorithms to multiple solutions
- Convergence rate of the semi-supervised greedy algorithm
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Selection of time instants and intervals with support vector regression for multivariate functional data
- Variational approximation for heteroscedastic linear models and matching pursuit algorithms
- Blessing of massive scale: spatial graphical model estimation with a total cardinality constraint approach
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- Regularized greedy column subset selection
- \(l_{0}\)-norm based structural sparse least square regression for feature selection
- Simultaneous variable selection and component selection for regression density estimation with mixtures of heteroscedastic experts
- Subspace learning for unsupervised feature selection via matrix factorization
- Approximate submodularity and its applications: subset selection, sparse approximation and dictionary selection
- Stability Selection
- Feature Selection for Ridge Regression with Provable Guarantees
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Structured, sparse regression with application to HIV drug resistance
- Reprint of: A forward-backward greedy approach for sparse multiscale learning
- Sparse signals recovery from noisy measurements by orthogonal matching pursuit
- A forward-backward greedy approach for sparse multiscale learning
- Oracle inequalities for the lasso in the Cox model
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Multi-class heterogeneous domain adaptation
- Forward stability and model path selection
- Correction to: "Efficient feature selection using shrinkage estimators"