Inference robust to outliers with \(\ell_1\)-norm penalization
Publication:5140337
Abstract: This paper considers the problem of inference in a linear regression model with outliers, where the number of outliers can grow with the sample size but their proportion tends to zero. We apply the square-root lasso estimator, penalizing the \(\ell_1\)-norm of a random vector that is non-zero only for outliers. We derive rates of convergence and asymptotic normality. Our estimator has the same asymptotic variance as the OLS estimator in the standard linear model, which makes it possible to build tests and confidence sets in the usual, simple manner. The proposed procedure is also computationally advantageous, as it amounts to solving a convex optimization program. Overall, the suggested approach constitutes a practical robust alternative to the ordinary least squares estimator.
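The convex program described in the abstract can be written as a square-root-lasso-type problem in which each observation gets its own shift parameter \(\alpha_i\) (non-zero only for outliers) and \(\|\alpha\|_1\) is penalized. Below is a minimal illustrative sketch of such a program using numpy and cvxpy; the simulated data, tuning constant, and outlier-flagging threshold are assumptions for demonstration, not values taken from the paper.

```python
import numpy as np
import cvxpy as cp

# Hypothetical simulated data with a small fraction of gross outliers (assumption).
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.ones(p)
alpha_true = np.zeros(n)
alpha_true[:10] = 10.0                      # 5% of observations are shifted
y = X @ beta_true + alpha_true + rng.standard_normal(n)

# Square-root-lasso-type program: jointly estimate beta and the per-observation
# shift alpha, penalizing the l1-norm of alpha so that most entries are zero.
beta = cp.Variable(p)
alpha = cp.Variable(n)
lam = 1.1 * np.sqrt(2 * np.log(n) / n)      # illustrative tuning constant (assumption)
objective = cp.Minimize(cp.norm(y - X @ beta - alpha, 2) / np.sqrt(n)
                        + lam * cp.norm(alpha, 1))
cp.Problem(objective).solve()

beta_hat = beta.value                                    # robust coefficient estimate
outliers = np.flatnonzero(np.abs(alpha.value) > 1e-6)    # observations flagged as outliers
```

In this sketch, observations with a non-zero estimated shift are treated as outliers, while the remaining observations drive the fit of beta; the threshold 1e-6 used for flagging is an arbitrary numerical tolerance, not part of the method.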
Recommendations
- Robust censored regression with \(\ell_1\)-norm regularization
- Robust and sparse estimators for linear regression models
- Fully efficient robust estimation, outlier detection, and variable selection via penalized regression
- Sharp non-asymptotic performance bounds for \(\ell_1\) and Huber robust regression estimators
- Robust sparse regression with high-breakdown value
Cites work
- scientific article; zbMATH DE number 6388313
- scientific article; zbMATH DE number 3954047
- scientific article; zbMATH DE number 194744
- scientific article; zbMATH DE number 5251637
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- High-dimensional probability. An introduction with applications in data science
- Least squares after model selection in high-dimensional sparse models
- Outlier detection using nonconvex penalized regression
- Regularization of case-specific parameters for robustness and efficiency
- Robust Estimation of a Location Parameter
- Robust covariance and scatter matrix estimation under Huber's contamination model
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Robust statistics. Theory and methods (with R)
- SOCP based variance free Dantzig selector with application to robust estimation
- Scaled sparse linear regression
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Square-root lasso: pivotal recovery of sparse signals via conic programming
Cited in (12)
- Robust and sparse estimators for linear regression models
- Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies
- Fast approximate L_ minimization: speeding up robust regression
- Penalized trimmed squares and a modification of support vectors for unmasking outliers in linear regression
- Fully efficient robust estimation, outlier detection, and variable selection via penalized regression
- Penalized unsupervised learning with outliers
- High-dimensional inference robust to outliers with \(\ell_1\)-norm penalization
- Sharp non-asymptotic performance bounds for \(\ell_1\) and Huber robust regression estimators
- Efficient and robust estimation of regression and scale parameters, with outlier detection
- Robust estimation of skew-normal parameters with application to outlier labelling
- Outlier detection using nonconvex penalized regression
- Robust censored regression with \(\ell_1\)-norm regularization