The L₁ penalized LAD estimator for high dimensional linear regression
From MaRDI portal
Publication: 391806
DOI: 10.1016/J.JMVA.2013.04.001 · zbMATH Open: 1279.62144 · arXiv: 1202.6347 · OpenAlex: W1978901787 · MaRDI QID: Q391806
Authors: Lie Wang
Publication date: 13 January 2014
Published in: Journal of Multivariate Analysis
Abstract: In this paper, the high-dimensional sparse linear regression model is considered, where the overall number of variables is larger than the number of observations. We investigate the \(L_1\) penalized least absolute deviation (LAD) method. Unlike most other methods, the \(L_1\) penalized LAD method does not require knowledge of the standard deviation of the noise or any moment assumptions on the noise. Our analysis shows that the method achieves near-oracle performance: with large probability, the \(L_2\) norm of the estimation error is of order \(\sqrt{k \log p / n}\). The result holds for a wide range of noise distributions, even the Cauchy distribution. Numerical results are also presented.
Full work available at URL: https://arxiv.org/abs/1202.6347
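The estimator studied in the abstract minimizes the sum of absolute residuals plus an \(L_1\) penalty, which can be recast as a linear program. The following is a minimal illustrative sketch of that reformulation (not the author's code); the function name `l1_penalized_lad` and the choice of `scipy.optimize.linprog` as the solver are assumptions for the example.

```python
# Hypothetical sketch of the L1-penalized LAD estimator:
#     min_beta  ||y - X beta||_1 + lam * ||beta||_1
# recast as a linear program with slack variables u >= |y - X beta|
# and v >= |beta|.  Not the author's implementation.
import numpy as np
from scipy.optimize import linprog

def l1_penalized_lad(X, y, lam):
    n, p = X.shape
    # Decision vector z = [beta (p), u (n), v (p)].
    c = np.concatenate([np.zeros(p), np.ones(n), lam * np.ones(p)])
    I_n, I_p = np.eye(n), np.eye(p)
    Zn = np.zeros((n, p))  # n x p zero block
    Zp = np.zeros((p, n))  # p x n zero block
    A_ub = np.block([
        [ X,  -I_n,  Zn],   #  X beta - u <= y
        [-X,  -I_n,  Zn],   # -X beta - u <= -y
        [ I_p,  Zp, -I_p],  #  beta - v <= 0
        [-I_p,  Zp, -I_p],  # -beta - v <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
    # beta is free; the slack variables u and v are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * (n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]
```

The LP reformulation is the standard route to computing LAD-type estimators: both the loss and the penalty are piecewise linear, so the slack variables turn each absolute value into a pair of linear inequalities.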
Recommendations
- A new penalized least absolute deviation model for high dimensional sparse linear regression and an efficient sequential linear programming algorithm
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- The linearized alternating direction method of multipliers for sparse group LAD model
Cites Work
- The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators. With comments by Ronald A. Thisted and M. R. Osborne and a rejoinder by the authors
- Simultaneous analysis of Lasso and Dantzig selector
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Quantile regression.
- Robust Statistics
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Probability Inequalities for Sums of Bounded Random Variables
- Compressive sampling
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Decoding by Linear Programming
- Asymptotic Theory of Least Absolute Error Regression
- Compressed sensing
- New volume ratio properties for convex symmetric bodies in \({\mathbb{R}}^ n\)
- On the consistency of feature selection using greedy least squares regression
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Limit Theorems for Moderate Deviation Probabilities
- Shifting Inequality and Recovery of Sparse Signals
- New Bounds for Restricted Isometry Constants
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- Asymptotic Analysis of Robust LASSOs in the Presence of Noise With Large Variance
Cited In (66)
- Robust sparse regression by modeling noise as a mixture of Gaussians
- Gradient projection Newton pursuit for sparsity constrained optimization
- Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
- Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
- Quantile regression for single-index-coefficient regression models
- A smoothing iterative method for quantile regression with nonconvex \(\ell_p\) penalty
- Robust error density estimation in ultrahigh dimensional sparse linear model
- Double fused Lasso penalized LAD for matrix regression
- Penalized and constrained LAD estimation in fixed and high dimension
- Adaptive robust variable selection
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Oracle estimation of a change point in high-dimensional quantile regression
- Wild bootstrap inference for penalized quantile regression for longitudinal data
- Iterative reweighted methods for \(\ell _1-\ell _p\) minimization
- Penalised robust estimators for sparse and high-dimensional linear models
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- A tuning-free robust and efficient approach to high-dimensional regression
- The linearized alternating direction method of multipliers for sparse group LAD model
- Sparse solutions of a class of constrained optimization problems
- A null-space-based weighted \(l_1\) minimization approach to compressed sensing
- An error bound for \(L_1\)-norm support vector machine coefficients in ultra-high dimension
- The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
- Sparse quantile regression
- Low rank matrix recovery with adversarial sparse noise
- Adaptive LASSO model selection in a multiphase quantile regression
- Adaptive Huber Regression
- A new penalized least absolute deviation model for high dimensional sparse linear regression and an efficient sequential linear programming algorithm
- \(\ell_1-\alpha\ell_2\) minimization methods for signal and image reconstruction with impulsive noise removal
- A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models
- A descent algorithm for constrained LAD-Lasso estimation with applications in portfolio selection
- Low rank matrix recovery with impulsive noise
- High-dimensional robust regression with \(L_q\)-loss functions
- Robust change point detection method via adaptive LAD-Lasso
- Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
- Group penalized quantile regression
- Scale calibration for high-dimensional robust regression
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- Pivotal estimation via square-root lasso in nonparametric regression
- A proximal dual semismooth Newton method for zero-norm penalized quantile regression estimator
- High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- Faster subgradient methods for functions with Hölderian growth
- A descent method for least absolute deviation Lasso problems
- Adaptive elastic net-penalized quantile regression for variable selection
- An efficient semismooth Newton method for adaptive sparse signal recovery problems
- High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
- Sure independence screening for analyzing supersaturated designs
- The adaptive L1-penalized LAD regression for partially linear single-index models
- Fast optimization methods for high-dimensional row-sparse multivariate quantile linear regression
- Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression
- Matrix recovery from nonconvex regularized least absolute deviations
- ADMM for High-Dimensional Sparse Penalized Quantile Regression
- A fast and effective algorithm for sparse linear regression with \(\ell_p\)-norm data fidelity and elastic net regularization
- Overview of robust variable selection methods for high-dimensional linear regression model
- High-dimensional volatility matrix estimation with cross-sectional dependent and heavy-tailed microstructural noise
- Fast Algorithms for LS and LAD-Collaborative Regression
- Incorporating Graphical Structure of Predictors in Sparse Quantile Regression
- A novel robust estimation for high-dimensional precision matrices
- A new active zero set descent algorithm for least absolute deviation with generalized LASSO penalty
- Heterogeneous robust estimation with the mixed penalty in high-dimensional regression model
- Analysis of global and local optima of regularized quantile regression in high dimensions: a subgradient approach