A look at robustness and stability of \(\ell_1\)- versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
From MaRDI portal
Publication:2225318
DOI: 10.1214/20-STS809
OpenAlex: W3098688965
MaRDI QID: Q2225318
Authors: Yuansi Chen, Armeen Taeb, Peter Bühlmann
Publication date: 8 February 2021
Published in: Statistical Science
Full work available at URL: https://projecteuclid.org/euclid.ss/1605603635
Recommendations
- Robust linear regression via \(\ell_0\)-regularization
- Robust regularization theory based on \(L_q\) \((0<q<1)\) regularization: the asymptotic distribution and variable selection consistence of solutions
- Characterization of the equivalence of robustification and regularization in linear and matrix regression
- Lipschitz behavior of the robust regularization
- The trade-off between regularity and stability in Tikhonov regularization
- Robust PCA via \(\ell_0\)-\(\ell_1\)-…
- Robust nonlinear regression modeling via \(L_1\)-type regularization
- Stability of \(L_1\)-norm regression under additional observations
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
Keywords: variable selection; latent variables; distributional robustness; high-dimensional estimation; low-rank estimation
Cites Work
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Exact spike train inference via \(\ell_{0}\) optimization
- Estimating the dimension of a model
- Heuristics of instability and stabilization in model selection
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Title not available
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- Bagging predictors
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Best subset selection via a modern optimization lens
- Title not available
- Sharp thresholds for high-dimensional and noisy sparsity recovery using \(\ell_1\)-constrained quadratic programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- Boosting for high-dimensional linear models
- \(\ell_{0}\)-penalized maximum likelihood for sparse directed acyclic graphs
- Matching pursuits with time-frequency dictionaries
- Title not available
- Relaxed Lasso
- Risk bounds for model selection via penalization
- Statistics for big data: a perspective
- Better Subset Regression Using the Nonnegative Garrote
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Title not available
- For most large underdetermined systems of linear equations the minimal \(\ell_1\)-norm solution is also the sparsest solution
- On Sparse Representations in Arbitrary Redundant Bases
- Greed is Good: Algorithmic Results for Sparse Approximation
- Sparse Recovery With Orthogonal Matching Pursuit Under RIP
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Title not available (DOI: 10.1162/153244303321897717)
- Atomic decomposition by basis pursuit
- Algorithm for cardinality-constrained quadratic optimization
- Efficient algorithms for computing the best subset regression models for large-scale problems
- Characterization of the equivalence of robustification and regularization in linear and matrix regression
- Robust Regression and Lasso
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Breakthroughs in statistics. Volume I: Foundations and basic theory
- Logistic regression: from art to science
- Certifiably optimal low rank factor analysis
Cited In (3)
Uses Software
This page was built for publication: A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.