Transductive versions of the Lasso and the Dantzig selector
DOI: 10.1016/j.jspi.2012.03.020 · zbMATH Open: 1428.62312 · arXiv: 0906.0652 · OpenAlex: W2964240626 · MaRDI QID: Q447611
Authors: Pierre Alquier, Mohamed Hebiri
Publication date: 4 September 2012
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://arxiv.org/abs/0906.0652
Recommendations
- On the prediction loss of the Lasso in the partially labeled setting
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- scientific article; zbMATH DE number 2243369
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
Keywords: Lasso; variable selection; high-dimensional data; sparsity; high-dimensional regression estimation; transduction
MSC: Nonparametric regression and quantile regression (62G08) · Linear regression; mixed models (62J05) · Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- High-dimensional additive modeling
- Estimating the dimension of a model
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Title not available
- Lasso-type recovery of sparse representations for high-dimensional data
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Title not available
- Title not available
- Text classification from labeled and unlabeled documents using EM
- DASSO: Connections Between the Dantzig Selector and Lasso
- Title not available
- On the Non-Negative Garrotte Estimator
- Aggregation for Gaussian regression
- Consistency of the group Lasso and multiple kernel learning
- The Dantzig selector and sparsity oracle inequalities
- Some theoretical results on the grouped variables Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Title not available
- On transductive support vector machines
- Smoothing \(\ell_1\)-penalized estimators for high-dimensional time-course data
- Sparse recovery in convex hulls via entropy penalization
- Aggregation by Exponential Weighting and Sharp Oracle Inequalities
- Lasso, iterative feature selection and the correlation selector: oracle inequalities and numerical performances
- Generalization of constraints for high dimensional regression problems
Cited In (3)