Adaptive robust variable selection
Abstract: Heavy-tailed high-dimensional data are commonly encountered in various scientific fields and pose great challenges to modern statistical analysis. A natural procedure to address this problem is to use penalized quantile regression with a weighted \(L_1\)-penalty, called the weighted robust Lasso (WR-Lasso), in which weights are introduced to ameliorate the bias problem induced by the \(L_1\)-penalty. In the ultra-high-dimensional setting, where the dimensionality can grow exponentially with the sample size, we investigate the model selection oracle property and establish the asymptotic normality of the WR-Lasso. We show that only mild conditions on the model error distribution are needed. Our theoretical results also reveal that an adaptive choice of the weight vector is essential for the WR-Lasso to enjoy these nice asymptotic properties. To make the WR-Lasso practically feasible, we propose a two-step procedure, called the adaptive robust Lasso (AR-Lasso), in which the weight vector in the second step is constructed from the \(L_1\)-penalized quantile regression estimate of the first step. This two-step procedure is justified theoretically to possess the oracle property and asymptotic normality. Numerical studies demonstrate the favorable finite-sample performance of the AR-Lasso.
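The two-step procedure described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's estimator: the optimizer is plain subgradient descent rather than a dedicated quantile-regression solver, the synthetic noise is Gaussian for reproducibility (the method targets heavy-tailed errors), and the second-step weights \(\lambda / \max(|\hat\beta_j|, \epsilon)\) are one simple adaptive choice among those the theory allows. All function names and tuning constants here are assumptions for the sketch.

```python
import random

def check_subgrad(u, tau):
    # Subgradient of the check loss rho_tau(u) = u * (tau - 1{u < 0}) w.r.t. u
    return tau - (1.0 if u < 0 else 0.0)

def wr_lasso(X, y, lam_vec, tau=0.5, lr=0.05, iters=3000):
    """Weighted L1-penalized quantile regression (WR-Lasso) fitted by plain
    subgradient descent; lam_vec[j] is the penalty weight on coefficient j."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for t in range(iters):
        step = lr / (1.0 + 0.01 * t)          # diminishing step size
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            r = yi - sum(b * x for b, x in zip(beta, xi))
            g = check_subgrad(r, tau)
            for j in range(p):
                grad[j] -= g * xi[j] / n      # loss part of the subgradient
        for j in range(p):
            sgn = (beta[j] > 0) - (beta[j] < 0)
            beta[j] -= step * (grad[j] + lam_vec[j] * sgn)
    return beta

def ar_lasso(X, y, lam=0.1, tau=0.5, eps=0.05):
    """Two-step AR-Lasso sketch: uniform weights first, then weights
    inversely proportional to the first-step coefficient magnitudes."""
    p = len(X[0])
    beta1 = wr_lasso(X, y, [lam] * p, tau)              # step 1: plain L1 penalty
    weights = [lam / max(abs(b), eps) for b in beta1]   # step 2: adaptive weights
    return wr_lasso(X, y, weights, tau)

# Tiny synthetic demo: only the first of four covariates matters.
random.seed(0)
n, p = 60, 4
X = [[random.random() for _ in range(p)] for _ in range(n)]
y = [2.0 * xi[0] + random.gauss(0.0, 0.3) for xi in X]
beta_hat = ar_lasso(X, y)
```

The adaptive weights shrink the penalty on coefficients that the first step estimates as large while penalizing near-zero coefficients heavily, which is what mitigates the \(L_1\) bias in the second step.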
Recommendations
- Robust adaptive Lasso for variable selection
- Penalized composite quasi-likelihood for ultrahigh dimensional variable selection
- Robust signed-rank variable selection in linear regression
- Robust variable selection in high-dimensional varying coefficient models based on weighted composite quantile regression
- Adaptive penalized quantile regression for high dimensional data
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 845714
- A unified approach to model selection and sparse recovery using regularized least squares
- Adaptive robust variable selection
- Composite quantile regression and the oracle model selection theory
- High dimensional covariance matrix estimation using a factor model
- Nearly unbiased variable selection under minimax concave penalty
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Nonconcave penalized likelihood with a diverging number of parameters.
- One-step sparse estimates in nonconcave penalized likelihood models
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Quasi-likelihood and/or robust estimation in high dimensions
- Regularization in statistics
- Simultaneous analysis of Lasso and Dantzig selector
- Statistics for high-dimensional data. Methods, theory and applications.
- The Adaptive Lasso and Its Oracle Properties
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection in quantile regression
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
Cited in
(only the first 100 citing works are shown)
- Penalized Quantile Regression for Distributed Big Data Using the Slack Variable Representation
- A proximal dual semismooth Newton method for zero-norm penalized quantile regression estimator
- Ultra-high dimensional longitudinal quantile feature screening based on modified Cholesky decomposition
- \(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- Robust regularization theory based on \(L_q\) \((0<q<1)\) regularization: the asymptotic distribution and variable selection consistence of solutions
- Sparse and robust estimation with ridge minimax concave penalty
- Robust Lasso and its applications in healthcare data
- Estimation of sparse covariance matrix via non-convex regularization
- Analysis of global and local optima of regularized quantile regression in high dimensions: a subgradient approach
- scientific article; zbMATH DE number 7306923
- Tractable Bayesian variable selection: beyond normality
- Distributed optimization and statistical learning for large-scale penalized expectile regression
- Adaptive elastic net-penalized quantile regression for variable selection
- Adaptive LASSO estimation for ARDL models with GARCH innovations
- A robust sparse linear approach for contaminated data
- Sparsity identification in ultra-high dimensional quantile regression models with longitudinal data
- Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
- Adaptive Bayesian SLOPE: Model Selection With Incomplete Data
- Adaptive efficient analysis for big data ergodic diffusion models
- scientific article; zbMATH DE number 7306878
- Weighted \(l_1\)-Penalized Corrected Quantile Regression for High-Dimensional Temporally Dependent Measurement Errors
- The main contributions of robust statistics to statistical science and a new challenge
- Semiparametric model averaging for ultrahigh-dimensional conditional quantile prediction
- Globally adaptive quantile regression with ultra-high dimensional data
- Fast optimization methods for high-dimensional row-sparse multivariate quantile linear regression
- Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression
- Nonparametric estimation of conditional quantile functions in the presence of irrelevant covariates
- Forward variable selection for ultra-high dimensional quantile regression models
- Robust variable selection with application to quality of life research
- ADMM for High-Dimensional Sparse Penalized Quantile Regression
- Robust variable selection based on the random quantile LASSO
- Variable selection in high-dimensional linear model with possibly asymmetric errors
- Are Latent Factor Regression and Sparse Regression Adequate?
- Robust and smoothing variable selection for quantile regression models with longitudinal data
- A smoothing iterative method for quantile regression with nonconvex \(\ell_p\) penalty
- Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression
- Screen then select: a strategy for correlated predictors in high-dimensional quantile regression
- Multiple influential point detection in high dimensional regression spaces
- Model selection consistency of Lasso for empirical data
- Robust shrinkage estimation and selection for functional multiple linear model through LAD loss
- Inference for High-Dimensional Censored Quantile Regression
- Expectile regression for analyzing heteroscedasticity in high dimension
- A network Lasso model for regression
- Overview of robust variable selection methods for high-dimensional linear regression model
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Adaptive robust variable selection
- Research and application of the weighted quantile regression with adaptive Lasso method
- Iterative adaptive robust variable selection in nonparametric additive models
- Iterative reweighted methods for \(\ell _1-\ell _p\) minimization
- Adaptive varying-coefficient linear quantile model: a profiled estimating equations approach
- Oracle estimation of a change point in high-dimensional quantile regression
- A Lasso-type robust variable selection for time-course microarray data
- A tuning-free robust and efficient approach to high-dimensional regression
- Elastic net penalized quantile regression model
- High-dimensional volatility matrix estimation with cross-sectional dependent and heavy-tailed microstructural noise
- High dimensional censored quantile regression
- scientific article; zbMATH DE number 7306908
- Adaptively weighted group Lasso for semiparametric quantile regression models
- Balanced Bayesian Lasso for heavy tails
- Robust signed-rank variable selection in linear regression
- Penalized composite quasi-likelihood for ultrahigh dimensional variable selection
- Robust statistics: a selective overview and new directions
- Byzantine-robust and efficient distributed sparsity learning: a surrogate composite quantile regression approach
- ARFIS: an adaptive robust model for regression with heavy-tailed distribution
- scientific article; zbMATH DE number 7370575
- scientific article; zbMATH DE number 7370628
- Byzantine-robust distributed sparse learning for \(M\)-estimation
- Distributed adaptive lasso penalized generalized linear models for big data
- Regularized robust estimation in binary regression models
- Weighted lasso with data integration
- Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- Robust high-dimensional tuning free multiple testing
- Variance estimation in high-dimensional linear regression via adaptive elastic-net
- Robust integrative analysis via quantile regression with homogeneity and sparsity
- Incorporating Graphical Structure of Predictors in Sparse Quantile Regression
- Penalized weighted smoothed quantile regression for high-dimensional longitudinal data
- Weighted likelihood transfer learning for high-dimensional generalized linear models
- Robust semiparametric gene-environment interaction analysis using sparse boosting
- The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
- Globally Adaptive Longitudinal Quantile Regression With High Dimensional Compositional Covariates
- Regularized estimation for the least absolute relative error models with a diverging number of covariates
- Outlier detection and robust variable selection via the penalized weighted LAD-LASSO method
- Sparse quantile regression
- Sequential Scaled Sparse Factor Regression
- A neutral comparison of algorithms to minimize \(L_0\) penalties for high-dimensional variable selection
- Inference for high-dimensional linear expectile regression with de-biasing method
- Adaptive Huber Regression
- An elastic-net penalized expectile regression with applications
- Distributed Sparse Composite Quantile Regression in Ultrahigh Dimensions
- Model selection in high-dimensional quantile regression with seamless \(L_0\) penalty
- Robust regression estimation and variable selection when cellwise and casewise outliers are present
- Large covariance estimation through elliptical factor models
- Robust network-based analysis of the associations between (epi)genetic measurements
- Transfer Learning with Large-Scale Quantile Regression
- Network-adaptive robust penalized estimation of time-varying coefficient models with longitudinal data
- Exponential squared loss based robust variable selection of AR models
- High-dimensional robust regression with \(L_q\)-loss functions
- Improved estimation method for high dimension semimartingale regression models based on discrete data
- Variable selection and structure identification for varying coefficient Cox models