On the conditions used to prove oracle results for the Lasso
DOI: 10.1214/09-EJS506 · zbMATH Open: 1327.62425 · arXiv: 0910.0722 · OpenAlex: W2092058109 · Wikidata: Q98839733 · MaRDI QID: Q1952029
Authors: Sara van de Geer, Peter Bühlmann
Publication date: 27 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/0910.0722
Keywords: Lasso; coherence; sparsity; irrepresentable condition; compatibility; restricted eigenvalue; restricted isometry
MSC: Nonparametric estimation (62G05) · Ridge regression; shrinkage estimators (Lasso) (62J07) · General considerations in statistical decision theory (62C05)
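The keywords refer to conditions on the design matrix under which the Lasso satisfies oracle inequalities. As a minimal sketch in standard notation (a common formulation, not quoted verbatim from the paper): with response \(Y\), design \(X\), and tuning parameter \(\lambda\), the Lasso estimator is
\[
\hat\beta = \arg\min_{\beta \in \mathbb{R}^p} \Big\{ \|Y - X\beta\|_2^2 / n + 2\lambda \|\beta\|_1 \Big\},
\]
and the compatibility condition holds for an active set \(S\) with constant \(\phi(S) > 0\) if
\[
\|\beta_S\|_1^2 \;\le\; \frac{|S|\,\beta^\top \hat\Sigma \beta}{\phi^2(S)}
\quad \text{for all } \beta \text{ with } \|\beta_{S^c}\|_1 \le 3\,\|\beta_S\|_1,
\]
where \(\hat\Sigma = X^\top X / n\). The restricted eigenvalue and restricted isometry conditions strengthen this requirement in different ways; the paper compares these and related conditions and establishes implications among them.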
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Lasso-type recovery of sparse representations for high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Sparsity oracle inequalities for the Lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Decoding by Linear Programming
- Aggregation for Gaussian regression
- Extreme Eigenvalues of Toeplitz Forms and Applications to Elliptic Difference Equations
- The Dantzig selector and sparsity oracle inequalities
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Sparse Density Estimation with ℓ1 Penalties
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Near-ideal model selection by \(\ell _{1}\) minimization
- Shifting Inequality and Recovery of Sparse Signals
- On Recovery of Sparse Signals Via $\ell _{1}$ Minimization
- Stable Recovery of Sparse Signals and an Oracle Inequality
- Sparsity in penalized empirical risk minimization
Cited In (only showing first 100 items)
- Fitting sparse linear models under the sufficient and necessary condition for model identification
- A study on tuning parameter selection for the high-dimensional lasso
- Recovery of partly sparse and dense signals
- A two-stage regularization method for variable selection and forecasting in high-order interaction model
- A systematic review on model selection in high-dimensional regression
- The variable selection by the Dantzig selector for Cox's proportional hazards model
- High dimensional regression for regenerative time-series: an application to road traffic modeling
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- High-dimensional inference for linear model with correlated errors
- Sorted concave penalized regression
- Corrected proof of the result of 'A prediction error property of the Lasso estimator and its generalization' by Huang (2003)
- Uniform inference in high-dimensional dynamic panel data models with approximately sparse fixed effects
- Multi-stage convex relaxation for feature selection
- A general family of trimmed estimators for robust high-dimensional data analysis
- Sparsest representations and approximations of an underdetermined linear system
- Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
- On the prediction loss of the Lasso in the partially labeled setting
- Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies
- Generalized Kalman smoothing: modeling and algorithms
- Greedy variance estimation for the LASSO
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Oracle inequalities for high-dimensional prediction
- Slope meets Lasso: improved oracle bounds and optimality
- On the post selection inference constant under restricted isometry properties
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- The Lasso for High Dimensional Regression with a Possible Change Point
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Prediction error bounds for linear regression with the TREX
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Simultaneous feature selection and clustering based on square root optimization
- REMI: regression with marginal information and its application in genome-wide association studies
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- Efficient nonconvex sparse group feature selection via continuous and discrete optimization
- Spatially-adaptive sensing in nonparametric regression
- Lasso in infinite dimension: application to variable selection in functional multivariate linear regression
- The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
- On the exponentially weighted aggregate with the Laplace prior
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- On tight bounds for the Lasso
- Normalized and standard Dantzig estimators: two approaches
- Sparsity considerations for dependent variables
- A review of Gaussian Markov models for conditional independence
- Poisson Regression With Error Corrupted High Dimensional Features
- Penalized least squares estimation in the additive model with different smoothness for the components
- A Rice method proof of the null-space property over the Grassmannian
- High-dimensional linear model selection motivated by multiple testing
- Generalized M-estimators for high-dimensional Tobit I models
- Additive model selection
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- High-dimensional regression with potential prior information on variable importance
- Estimating networks with jumps
- Approximate \(\ell_0\)-penalized estimation of piecewise-constant signals on graphs
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
- A simple method for estimating interactions between a treatment and a large number of covariates
- Sign-constrained least squares estimation for high-dimensional regression
- Exponential screening and optimal rates of sparse estimation
- Local linear smoothing for sparse high dimensional varying coefficient models
- Adaptive kernel estimation of the baseline function in the Cox model with high-dimensional covariates
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Sparse semiparametric discriminant analysis
- Regularity properties for sparse regression
- An analysis of penalized interaction models
- Censored linear model in high dimensions. Penalised linear regression on high-dimensional data with left-censored response variable
- The finite sample properties of sparse M-estimators with pseudo-observations
- The Lasso problem and uniqueness
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- On higher order isotropy conditions and lower bounds for sparse quadratic forms
- Bayesian linear regression with sparse priors
- Restricted eigenvalue properties for correlated Gaussian designs
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Prediction and estimation consistency of sparse multi-class penalized optimal scoring
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- Variable selection in partial linear regression with functional covariate
- Best subset binary prediction
- Estimation for high-dimensional linear mixed-effects models using \(\ell_1\)-penalization
- Transductive versions of the Lasso and the Dantzig selector
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Rejoinder to the comments on: \(\ell _{1}\)-penalization for mixture regression models
- Generalization of constraints for high dimensional regression problems
- Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
- The convex geometry of linear inverse problems
- On the uniform convergence of empirical norms and inner products, with application to causal inference
- The \(l_q\) consistency of the Dantzig selector for Cox's proportional hazards model
- Control variate selection for Monte Carlo integration
- Regularized estimation in sparse high-dimensional time series models
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Weaker regularity conditions and sparse recovery in high-dimensional regression
- Goodness-of-Fit Tests for High Dimensional Linear Models
- High-dimensional additive hazards models and the lasso
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Correlated variables in regression: clustering and sparse estimation
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Regularized estimation of high‐dimensional vector autoregressions with weakly dependent innovations
- Finite mixture regression: a sparse variable selection by model selection for clustering