Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
DOI: 10.1016/j.jkss.2016.12.003 · zbMath: 1368.62210 · OpenAlex: W2573268497 · MaRDI QID: Q2398409
Publication date: 16 August 2017
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1016/j.jkss.2016.12.003
MSC classifications: Ridge regression; shrinkage estimators (Lasso) (62J07) · Robustness and adaptive procedures (parametric inference) (62F35)
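For context, the WLAD-SCAD estimator named in the title combines a weighted least absolute deviation loss with the SCAD penalty. A sketch of the objective, with notation assumed rather than taken from this entry:

```latex
\hat{\beta} = \arg\min_{\beta}
  \sum_{i=1}^{n} w_i \left| y_i - x_i^{\top}\beta \right|
  + n \sum_{j=1}^{p_n} p_{\lambda}\!\left(|\beta_j|\right),
```

where the weights \(w_i\) downweight high-leverage observations (the "weighted LAD" part) and \(p_{\lambda}\) is the SCAD penalty of Fan and Li, typically defined through its derivative

```latex
p'_{\lambda}(t) = \lambda \left\{ I(t \le \lambda)
  + \frac{(a\lambda - t)_{+}}{(a-1)\lambda}\, I(t > \lambda) \right\},
  \qquad a > 2,
```

with \(a = 3.7\) the usual default. The "diverging number of parameters" in the title refers to letting the dimension \(p_n\) grow with the sample size \(n\).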
Related Items (3)
- Group selection via adjusted weighted least absolute deviation regression
- Doubly robust weighted composite quantile regression based on SCAD‐L2
- Robust check loss-based inference of semiparametric models and its application in environmental data
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Consistent tuning parameter selection in high dimensional sparse linear regression
- Simultaneous estimation and variable selection in median regression using Lasso-type penalty
- Composite quantile regression and the oracle model selection theory
- One-step sparse estimates in nonconcave penalized likelihood models
- Robust weighted LAD regression
- \(M\)-estimation of linear models with dependent errors
- Asymptotic behavior of M estimators of p regression parameters when \(p^ 2/n\) is large. II: Normal approximation
- Hedonic housing prices and the demand for clean air
- Limiting distributions for \(L_1\) regression estimators under general conditions
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Nonconcave penalized likelihood with a diverging number of parameters.
- Robust spline-based variable selection in varying coefficient model
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Adaptive robust variable selection
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Shrinkage Tuning Parameter Selection with a Diverging number of Parameters
- Extended Bayesian information criteria for model selection with large model spaces
- Unified LASSO Estimation by Least Squares Approximation
- Weighted Wilcoxon‐Type Smoothly Clipped Absolute Deviation Method
- Leverage and Breakdown in L 1 Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Nonconcave Penalized Likelihood With NP-Dimensionality
- A mathematical programming approach for improving the robustness of least sum of absolute deviations regression
- Tuning parameter selectors for the smoothly clipped absolute deviation method