Robust regression through the Huber's criterion and adaptive lasso penalty
From MaRDI portal
Publication: 1952217
DOI: 10.1214/11-EJS635
zbMath: 1274.62467
arXiv: 1207.6868
MaRDI QID: Q1952217
Laurent Zwald, Sophie Lambert-Lacroix
Publication date: 28 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1207.6868
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Applications of statistics to actuarial sciences and financial mathematics (62P05)
- Robustness and adaptive procedures (parametric inference) (62F35)
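The method named in the title couples Huber's robust loss with an adaptively weighted ℓ1 (adaptive lasso) penalty. A minimal proximal-gradient sketch of that combination follows; it is an illustrative reimplementation under standard assumptions (function names, defaults, and the ISTA solver are choices made here, not the authors' algorithm or code):

```python
import numpy as np

def huber_psi(r, delta=1.345):
    """Derivative of the Huber loss: identity on [-delta, delta], clipped outside."""
    return np.clip(r, -delta, delta)

def huber_adaptive_lasso(X, y, lam=0.1, gamma=1.0, delta=1.345, n_iter=500):
    """Proximal-gradient (ISTA) sketch of Huber-loss regression with an
    adaptive lasso penalty. Names and defaults are illustrative only."""
    n, p = X.shape
    pilot = np.linalg.lstsq(X, y, rcond=None)[0]   # pilot estimate for the weights
    w = 1.0 / (np.abs(pilot) ** gamma + 1e-8)      # adaptive-lasso weights
    step = n / np.linalg.norm(X, 2) ** 2           # 1/L for the smooth Huber part
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = -X.T @ huber_psi(y - X @ beta, delta) / n
        z = beta - step * grad
        # soft-thresholding: prox of the weighted l1 penalty
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return beta
```

The two mechanisms are visible in the loop: the clipped score `huber_psi` bounds each observation's influence (robustness to outliers), while the weighted soft-thresholding sets coefficients with small pilot estimates exactly to zero (adaptive variable selection).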
Related Items
- Robust methods for inferring sparse network structures
- Penalized and constrained LAD estimation in fixed and high dimension
- A general family of trimmed estimators for robust high-dimensional data analysis
- A New Principle for Tuning-Free Huber Regression
- Approximate Gibbs sampler for Bayesian Huberized lasso
- Aggregated hold out for sparse linear regression with a robust loss function
- The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
- Perspective functions: properties, constructions, and examples
- Robust and sparse estimators for linear regression models
- New Robust Variable Selection Methods for Linear Regression Models
- Safe feature screening rules for the regularized Huber regression
- Robust variable selection based on the random quantile LASSO
- Penalised robust estimators for sparse and high-dimensional linear models
- Perspective functions: proximal calculus and applications in high-dimensional statistics
- Robustness and Tractability for Non-convex M-estimators
- General matching quantiles M-estimation
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- A robust, adaptive M-estimator for pointwise estimation in heteroscedastic regression
- Doubly robust weighted composite quantile regression based on SCAD-L2
- Sparse and robust estimation with ridge minimax concave penalty
- Outlier-resistant high-dimensional regression modelling based on distribution-free outlier detection and tuning parameter selection
- Robust censored regression with \(\ell_1\)-norm regularization
- Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression
- Robust inference for high-dimensional single index models
- High-dimensional inference robust to outliers with ℓ1-norm penalization
- Robust Multivariate Lasso Regression with Covariance Estimation
- Renewable Huber estimation method for streaming datasets
- Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
- Inference robust to outliers with ℓ1-norm penalization
- Robust variable selection and estimation in threshold regression model
- Geometric median and robust estimation in Banach spaces
- Perspective maximum likelihood-type estimation via proximal decomposition
- Quasi-likelihood and/or robust estimation in high dimensions
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Low-rank elastic-net regularized multivariate Huber regression model
- Regular, median and Huber cross-validation: A computational comparison
- Robust sparse regression and tuning parameter selection via the efficient bootstrap information criteria
- A Robust Variable Selection to t-type Joint Generalized Linear Models via Penalized t-type Pseudo-likelihood
- Robust adaptive Lasso for variable selection
- Influence Diagnostics for High-Dimensional Lasso Regression
- Degrees of freedom for regularized regression with Huber loss and linear constraints
- The adaptive BerHu penalty in robust regression
- Asymptotic linear expansion of regularized M-estimators
- Robust subset selection
- Distributed adaptive Huber regression
- High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
- Tractable Bayesian Variable Selection: Beyond Normality
- Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
- Outlier detection and robust variable selection via the penalized weighted LAD-LASSO method
- A Lasso-type Robust Variable Selection for Time-Course Microarray Data
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Introduction to robust and quasi-robust statistical methods
- Cox's regression model for counting processes: A large sample study
- Two-stage Huber estimation
- Estimating the dimension of a model
- A distribution-free theory of nonparametric regression
- Asymptotics for Lasso-type estimators
- On the asymptotics of constrained \(M\)-estimation
- Weak convergence and empirical processes. With applications to statistics
- High-dimensional graphs and variable selection with the Lasso
- Graph Implementations for Nonsmooth Convex Programs
- Unified LASSO Estimation by Least Squares Approximation
- Preservation of Convergence of Convex Sets and Functions in Finite Dimensions
- Two Robust Alternatives to Least-Squares Regression
- Asymptotic Statistics
- Alternatives to the Median Absolute Deviation
- Variational Analysis
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Convex Analysis
- Asymptotic Properties of Non-Linear Least Squares Estimators
- Robust Statistics