Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
Publication: 2398409
DOI: 10.1016/J.JKSS.2016.12.003
zbMATH Open: 1368.62210
OpenAlex: W2573268497
MaRDI QID: Q2398409
FDO: Q2398409
Authors: Yanxin Wang, L. Zhu
Publication date: 16 August 2017
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1016/j.jkss.2016.12.003
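The estimator named in the title combines a weighted least absolute deviation (WLAD) loss with the SCAD penalty. A minimal sketch of the objective, assuming the standard formulation used in the related WLAD-LASSO and SCAD works listed below:
\[
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p_n}} \sum_{i=1}^{n} w_i \bigl| y_i - x_i^{\top}\beta \bigr| \;+\; n \sum_{j=1}^{p_n} p_{\lambda_n}\bigl(|\beta_j|\bigr),
\]
where the weights \(w_i\) downweight high-leverage design points (for instance, weights built from robust distances of the \(x_i\)), the dimension \(p_n\) may diverge with the sample size \(n\), and \(p_{\lambda}\) is the SCAD penalty of Fan and Li, defined through its derivative
\[
p_{\lambda}'(t) = \lambda \left\{ I(t \le \lambda) + \frac{(a\lambda - t)_{+}}{(a-1)\lambda}\, I(t > \lambda) \right\}, \qquad t > 0,\ a > 2.
\]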
Recommendations
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- WLAD-LASSO method for robust estimation and variable selection in partially linear models
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Outlier detection and robust variable selection via the penalized weighted LAD-LASSO method
- SCAD penalized rank regression with a diverging number of parameters
Mathematics Subject Classification
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Robustness and adaptive procedures (parametric inference) (62F35)
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Extended Bayesian information criteria for model selection with large model spaces
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- One-step sparse estimates in nonconcave penalized likelihood models
- Robust regression: Asymptotics, conjectures and Monte Carlo
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Hedonic housing prices and the demand for clean air
- Limiting distributions for \(L_1\) regression estimators under general conditions
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Adaptive robust variable selection
- Shrinkage tuning parameter selection with a diverging number of parameters
- Unified LASSO Estimation by Least Squares Approximation
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Nonconcave penalized likelihood with a diverging number of parameters.
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Composite quantile regression and the oracle model selection theory
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- \(M\)-estimation of linear models with dependent errors
- Nonconcave penalized M-estimation with a diverging number of parameters
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Simultaneous estimation and variable selection in median regression using Lasso-type penalty
- Variable selection in quantile regression
- Asymptotic behavior of M estimators of p regression parameters when \(p^2/n\) is large. II: Normal approximation
- Robust weighted LAD regression
- Weighted Wilcoxon‐Type Smoothly Clipped Absolute Deviation Method
- Leverage and Breakdown in \(L_1\) Regression
- A mathematical programming approach for improving the robustness of least sum of absolute deviations regression
- Consistent tuning parameter selection in high dimensional sparse linear regression
- Robust spline-based variable selection in varying coefficient model
Cited In (9)
- Doubly robust weighted composite quantile regression based on SCAD‐L2
- Mathematical programming for simultaneous feature selection and outlier detection under \(\ell_1\) norm
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Outlier detection and robust variable selection via the penalized weighted LAD-LASSO method
- SCAD penalized rank regression with a diverging number of parameters
- Group selection via adjusted weighted least absolute deviation regression
- WLAD-LASSO method for robust estimation and variable selection in partially linear models
- Robust check loss-based inference of semiparametric models and its application in environmental data
- SCAD-penalized least absolute deviation regression in high-dimensional models