On the consistency of feature selection using greedy least squares regression
From MaRDI portal
Publication:2880890
Cited in (32)
- A novel T-S fuzzy systems identification with block structured sparse representation
- Greedy algorithms for prediction
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Rank-Based Greedy Model Averaging for High-Dimensional Survival Data
- Fitting very large sparse Gaussian graphical models
- Sparse estimation in Ising model via penalized Monte Carlo methods
- Extending greedy feature selection algorithms to multiple solutions
- Convergence rate of the semi-supervised greedy algorithm
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Inconsistency guided robust attribute reduction
- Variational approximation for heteroscedastic linear models and matching pursuit algorithms
- Selection of time instants and intervals with support vector regression for multivariate functional data
- Blessing of massive scale: spatial graphical model estimation with a total cardinality constraint approach
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- \(l_{0}\)-norm based structural sparse least square regression for feature selection
- Regularized greedy column subset selection
- Simultaneous variable selection and component selection for regression density estimation with mixtures of heteroscedastic experts
- Subspace learning for unsupervised feature selection via matrix factorization
- Approximate submodularity and its applications: subset selection, sparse approximation and dictionary selection
- Stability Selection
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Feature Selection for Ridge Regression with Provable Guarantees
- Structured, sparse regression with application to HIV drug resistance
- Reprint of: A forward-backward greedy approach for sparse multiscale learning
- Sparse signals recovery from noisy measurements by orthogonal matching pursuit
- A forward-backward greedy approach for sparse multiscale learning
- Oracle inequalities for the lasso in the Cox model
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Multi-class heterogeneous domain adaptation
- Correction to: ``Efficient feature selection using shrinkage estimators''
- Forward stability and model path selection