\(L_0\)-regularization for high-dimensional regression with corrupted data
Publication: 6082450
DOI: 10.1080/03610926.2022.2076125
MaRDI QID: Q6082450
No author found.
Publication date: 29 November 2023
Published in: Communications in Statistics - Theory and Methods
Keywords: model selection; polynomial algorithm; measurement errors; \(L_0\)-regularization; nearest positive semi-definite matrix projection
Cites Work
- Measurement error in Lasso: impact and likelihood bias correction
- Missing values: sparse inverse covariance estimation and an extension to sparse regression
- Best subset selection via a modern optimization lens
- Variable selection in measurement error models
- Sparse recovery under matrix uncertainty
- CoCoLasso for high-dimensional error-in-variables regression
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- Balanced estimation for high-dimensional measurement error models
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A polynomial algorithm for best-subset selection problem
- Asymptotic properties for combined \(L_1\) and concave regularization
- High Dimensional Thresholded Regression and Shrinkage Effect
- A general theory of concave regularization for high-dimensional sparse estimation problems