The L₁ penalized LAD estimator for high dimensional linear regression
From MaRDI portal
Abstract: In this paper, the high-dimensional sparse linear regression model is considered, where the overall number of variables is larger than the number of observations. We investigate the \(L_1\) penalized least absolute deviation method. Unlike most other methods, the \(L_1\) penalized LAD method requires no knowledge of the standard deviation of the noise and no moment assumptions on the noise. Our analysis shows that the method achieves near-oracle performance, i.e. with large probability, the \(L_2\) norm of the estimation error is of order \(\sqrt{k\log p/n}\), where \(k\) is the number of nonzero coefficients, \(p\) the number of variables and \(n\) the number of observations. The result holds for a wide range of noise distributions, even the Cauchy distribution. Numerical results are also presented.
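The estimator described in the abstract minimizes an absolute-deviation loss plus an \(L_1\) penalty, \(\frac{1}{n}\|y - X\beta\|_1 + \lambda\|\beta\|_1\), which can be cast as a linear program. Below is a minimal illustrative sketch using `scipy.optimize.linprog`; the helper name `lad_lasso`, the \(1/n\) scaling, and the penalty level in the usage example are choices made here for illustration, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog


def lad_lasso(X, y, lam):
    """L1-penalized LAD via an LP with variables z = [beta (p), u (n), v (p)],
    where u bounds |y - X beta| elementwise and v bounds |beta| elementwise."""
    n, p = X.shape
    # Objective: (1/n) * sum(u) + lam * sum(v); beta itself has zero cost.
    c = np.concatenate([np.zeros(p), np.ones(n) / n, lam * np.ones(p)])
    I_n, I_p = np.eye(n), np.eye(p)
    Z_np, Z_pn = np.zeros((n, p)), np.zeros((p, n))
    A_ub = np.vstack([
        np.hstack([ X,   -I_n, Z_np]),   #  X beta - u <= y
        np.hstack([-X,   -I_n, Z_np]),   # -X beta - u <= -y
        np.hstack([ I_p, Z_pn, -I_p]),   #  beta - v <= 0
        np.hstack([-I_p, Z_pn, -I_p]),   # -beta - v <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
    # beta is free; the auxiliary variables u, v are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * (n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]


# Usage sketch: sparse signal with heavy-tailed (Cauchy) noise, a setting
# the paper's theory covers; the tuning constant 1.5 is an arbitrary pick.
rng = np.random.default_rng(0)
n, p = 50, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta + 0.1 * rng.standard_cauchy(n)
beta_hat = lad_lasso(X, y, lam=1.5 * np.sqrt(np.log(p) / n))
```

At the LP optimum, each \(u_i\) equals the residual \(|y_i - x_i^\top\beta|\) and each \(v_j\) equals \(|\beta_j|\), so the LP value coincides with the penalized LAD objective.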
Recommendations
- A new penalized least absolute deviation model for high dimensional sparse linear regression and an efficient sequential linear programming algorithm
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- The linearized alternating direction method of multipliers for sparse group LAD model
Cites work
- scientific article; zbMATH DE number 845714
- Asymptotic Analysis of Robust LASSOs in the Presence of Noise With Large Variance
- Asymptotic Theory of Least Absolute Error Regression
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- Compressed sensing
- Compressive sampling
- Decoding by Linear Programming
- Limit Theorems for Moderate Deviation Probabilities
- New Bounds for Restricted Isometry Constants
- New volume ratio properties for convex symmetric bodies in \({\mathbb{R}}^ n\)
- On the consistency of feature selection using greedy least squares regression
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Probability Inequalities for Sums of Bounded Random Variables
- Quantile regression.
- Robust Statistics
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Shifting Inequality and Recovery of Sparse Signals
- Simultaneous analysis of Lasso and Dantzig selector
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators. With comments by Ronald A. Thisted and M. R. Osborne and a rejoinder by the authors
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
Cited in (66)
- Sure independence screening for analyzing supersaturated designs
- The adaptive L1-penalized LAD regression for partially linear single-index models
- Robust sparse regression by modeling noise as a mixture of Gaussians
- Fast optimization methods for high-dimensional row-sparse multivariate quantile linear regression
- Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression
- Matrix recovery from nonconvex regularized least absolute deviations
- ADMM for High-Dimensional Sparse Penalized Quantile Regression
- Gradient projection Newton pursuit for sparsity constrained optimization
- A fast and effective algorithm for sparse linear regression with \(\ell_p\)-norm data fidelity and elastic net regularization
- Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
- Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
- Quantile regression for single-index-coefficient regression models
- A smoothing iterative method for quantile regression with nonconvex \(\ell_p\) penalty
- Robust error density estimation in ultrahigh dimensional sparse linear model
- Double fused Lasso penalized LAD for matrix regression
- Penalized and constrained LAD estimation in fixed and high dimension
- Overview of robust variable selection methods for high-dimensional linear regression model
- Adaptive robust variable selection
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Iterative reweighted methods for \(\ell _1-\ell _p\) minimization
- Oracle estimation of a change point in high-dimensional quantile regression
- Penalised robust estimators for sparse and high-dimensional linear models
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- Wild bootstrap inference for penalized quantile regression for longitudinal data
- The linearized alternating direction method of multipliers for sparse group LAD model
- A tuning-free robust and efficient approach to high-dimensional regression
- High-dimensional volatility matrix estimation with cross-sectional dependent and heavy-tailed microstructural noise
- A null-space-based weighted \(l_1\) minimization approach to compressed sensing
- Sparse solutions of a class of constrained optimization problems
- Fast Algorithms for LS and LAD-Collaborative Regression
- An error bound for \(L_1\)-norm support vector machine coefficients in ultra-high dimension
- Incorporating Graphical Structure of Predictors in Sparse Quantile Regression
- The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
- Sparse quantile regression
- Adaptive LASSO model selection in a multiphase quantile regression
- Adaptive Huber Regression
- Low rank matrix recovery with adversarial sparse noise
- scientific article; zbMATH DE number 6472991
- A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models
- A novel robust estimation for high-dimensional precision matrices
- A new penalized least absolute deviation model for high dimensional sparse linear regression and an efficient sequential linear programming algorithm
- \(\ell_1-\alpha\ell_2\) minimization methods for signal and image reconstruction with impulsive noise removal
- A descent algorithm for constrained LAD-Lasso estimation with applications in portfolio selection
- Low rank matrix recovery with impulsive noise
- High-dimensional robust regression with \(L_q\)-loss functions
- A new active zero set descent algorithm for least absolute deviation with generalized LASSO penalty
- Robust change point detection method via adaptive LAD-Lasso
- Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
- Group penalized quantile regression
- Scale calibration for high-dimensional robust regression
- Pivotal estimation via square-root lasso in nonparametric regression
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- High-dimensional robust approximated M-estimators for mean regression with asymmetric data
- A proximal dual semismooth Newton method for zero-norm penalized quantile regression estimator
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- scientific article; zbMATH DE number 6982301
- Faster subgradient methods for functions with Hölderian growth
- Heterogeneous robust estimation with the mixed penalty in high-dimensional regression model
- Analysis of global and local optima of regularized quantile regression in high dimensions: a subgradient approach
- A descent method for least absolute deviation Lasso problems
- scientific article; zbMATH DE number 7306923
- Adaptive elastic net-penalized quantile regression for variable selection
- An efficient semismooth Newton method for adaptive sparse signal recovery problems
- High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
- Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
- SCAD-penalized least absolute deviation regression in high-dimensional models
This page was built for publication: The \(L_1\) penalized LAD estimator for high dimensional linear regression
MaRDI item Q391806