Lasso-type recovery of sparse representations for high-dimensional data
DOI: 10.1214/07-AOS582 · zbMATH Open: 1155.62050 · arXiv: 0806.0145 · OpenAlex: W3106266785 · Wikidata: Q105584243 · Scholia: Q105584243 · MaRDI QID: Q1002157 · FDO: Q1002157
Authors: Nicolai Meinshausen, Bin Yu
Publication date: 25 February 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0806.0145
Recommendations
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- scientific article; zbMATH DE number 5957408
- Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
- Adaptive Lasso for sparse high-dimensional regression models
- On the sensitivity of the Lasso to the number of predictor variables
MSC classification:
- Asymptotic properties of parametric estimators (62F12)
- Nonparametric regression and quantile regression (62G08)
- Linear regression; mixed models (62J05)
- Statistical ranking and selection procedures (62F07)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
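As context for the shrinkage-estimation classification above, the following is a minimal illustrative sketch of Lasso-type sparse recovery (not the authors' code): iterative soft-thresholding (ISTA) applied to a simulated high-dimensional regression with more predictors than observations and a sparse true coefficient vector. All dimensions and parameter values are invented for the example.

```python
# Illustrative sketch only (not from the paper): Lasso estimation by
# iterative soft-thresholding (ISTA) on simulated data y = X @ beta + noise,
# with p > n predictors and an s-sparse true coefficient vector.
import numpy as np

def soft_threshold(z, t):
    """Componentwise soft-thresholding, the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=1000):
    """Minimize (1/2n)||y - X b||^2 + lam * ||b||_1 by proximal gradient steps."""
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n       # gradient of the smooth squared-error term
        b = soft_threshold(b - step * grad, step * lam)
    return b

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                      # high-dimensional: p > n, s-sparse truth
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 3.0                        # nonzeros concentrated in the first s slots
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_hat = lasso_ista(X, y, lam=0.2)
top = np.argsort(-np.abs(beta_hat))[:s]
print("largest estimated coefficients at indices:", sorted(top.tolist()))
```

The estimate is shrunk toward zero (the bias analyzed in several of the works cited below), but the largest fitted coefficients sit on the true support.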
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Title not available
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Model Selection and Estimation in Regression with Grouped Variables
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Asymptotics for Lasso-type estimators.
- Sparsity oracle inequalities for the Lasso
- A theory for multiresolution signal decomposition: the wavelet representation
- The Group Lasso for Logistic Regression
- Asymptotics of sample eigenstructure for a large dimensional spiked covariance model
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Relaxed Lasso
- Local operator theory, random matrices and Banach spaces.
- Title not available
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Just relax: convex programming methods for identifying sparse signals in noise
- Aggregation for Gaussian regression
- For most large underdetermined systems of linear equations the minimal \(\ell_1\)-norm solution is also the sparsest solution
- Adaptive Lasso for Cox's proportional hazards model
- Recovery of Exact Sparse Representations in the Presence of Bounded Noise
- Greed is Good: Algorithmic Results for Sparse Approximation
- Sparse representations in unions of bases
- Using circulant symmetry to model featureless objects
- The resolution of closely adjacent spectral lines
Cited In (only the first 100 items are shown)
- Discussion of: "Grouping strategies and thresholding for high dimension linear models"
- Recovery of partly sparse and dense signals
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Regularization and the small-ball method. I: Sparse recovery
- Model selection consistency of Lasso for empirical data
- Generalized Kalman smoothing: modeling and algorithms
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Greedy variance estimation for the LASSO
- Penalised robust estimators for sparse and high-dimensional linear models
- High dimensional single index models
- Robust machine learning by median-of-means: theory and practice
- Minimization of $L_1$ Over $L_2$ for Sparse Signal Recovery with Convergence Guarantee
- The Lasso for High Dimensional Regression with a Possible Change Point
- Convex biclustering
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Transductive versions of the Lasso and the Dantzig selector
- Variable selection in censored quantile regression with high dimensional data
- Sparse Recovery With Unknown Variance: A LASSO-Type Approach
- Generalization of constraints for high dimensional regression problems
- PAC-Bayesian estimation and prediction in sparse additive models
- A unified penalized method for sparse additive quantile models: an RKHS approach
- High-dimensional sparse portfolio selection with nonnegative constraint
- Focused vector information criterion model selection and model averaging regression with missing response
- Searching for minimal optimal neural networks
- Least squares approximation with a diverging number of parameters
- SCAD-penalized quantile regression for high-dimensional data analysis and variable selection
- Recovery of seismic wavefields by an \(l_{q}\)-norm constrained regularization method
- Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
- Approximate Spectral Gaps for Markov Chain Mixing Times in High Dimensions
- Bayesian high-dimensional screening via MCMC
- Estimation of high-dimensional partially-observed discrete Markov random fields
- Information-Based Optimal Subdata Selection for Big Data Linear Regression
- High-dimensional Bayesian inference in nonparametric additive models
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- A doubly sparse approach for group variable selection
- MAP model selection in Gaussian regression
- A generalized elastic net regularization with smoothed \(\ell _{q}\) penalty for sparse vector recovery
- On the sparsity of Lasso minimizers in sparse data recovery
- Nonparametric and high-dimensional functional graphical models
- A review of Gaussian Markov models for conditional independence
- Sparse Laplacian shrinkage with the graphical Lasso estimator for regression problems
- Monte Carlo simulation for Lasso-type problems by estimator augmentation
- Estimation of average treatment effects with panel data: asymptotic theory and implementation
- Consistent tuning parameter selection in high dimensional sparse linear regression
- An Augmented Lagrangian Method for $\ell_{1}$-Regularized Optimization Problems with Orthogonality Constraints
- A new hybrid \(l_p\)-\(l_2\) model for sparse solutions with applications to image processing
- Title not available
- Generalized M-estimators for high-dimensional Tobit I models
- Cross-Validation With Confidence
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Identifying small mean-reverting portfolios
- Learning sparse classifiers with difference of convex functions algorithms
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
- Estimation and variable selection in partial linear single index models with error-prone linear covariates
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- On the conditions used to prove oracle results for the Lasso
- Grouping strategies and thresholding for high dimensional linear models
- Regression on manifolds: estimation of the exterior derivative
- Lazy lasso for local regression
- Least squares after model selection in high-dimensional sparse models
- Sign-constrained least squares estimation for high-dimensional regression
- Adaptive Dantzig density estimation
- Influence measures and stability for graphical models
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- Sub-optimality of some continuous shrinkage priors
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Simultaneous analysis of Lasso and Dantzig selector
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- High-dimensional generalized linear models and the lasso
- An analysis of penalized interaction models
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Graphical-model based high dimensional generalized linear models
- Minimax-optimal nonparametric regression in high dimensions
- A two-step method for estimating high-dimensional Gaussian graphical models
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Estimation and variable selection with exponential weights
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Near-ideal model selection by \(\ell _{1}\) minimization
- A comparison of the Lasso and marginal regression
- Variable selection in high-dimensional partly linear additive models
- Estimation for high-dimensional linear mixed-effects models using \(\ell_1\)-penalization
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- A new perspective on least squares under convex constraint
- On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- On the asymptotic properties of the group lasso estimator for linear models
- Thresholding-based iterative selection procedures for model selection and shrinkage
- AIC for the Lasso in generalized linear models
- Consistency of Bayesian linear model selection with a growing number of parameters
- \(L_1\)-regularized least squares for support recovery of high dimensional single index models with Gaussian designs
- Two tales of variable selection for high dimensional regression: Screening and model building
- Title not available
- Nearly unbiased variable selection under minimax concave penalty
- A Cluster Elastic Net for Multivariate Regression
- Structured estimation for the nonparametric Cox model
- Regression analysis of locality preserving projections via sparse penalty
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Stability Selection
- Extensions of stability selection using subsamples of observations and covariates