Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
From MaRDI portal
Publication: 2398409
Recommendations
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- WLAD-LASSO method for robust estimation and variable selection in partially linear models
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Outlier detection and robust variable selection via the penalized weighted LAD-LASSO method
- SCAD penalized rank regression with a diverging number of parameters
Cites work
- Scientific article; zbMATH DE number 845714 (no title available)
- A mathematical programming approach for improving the robustness of least sum of absolute deviations regression
- Adaptive robust variable selection
- Asymptotic behavior of M-estimators of \(p\) regression parameters when \(p^2/n\) is large. II: Normal approximation
- Composite quantile regression and the oracle model selection theory
- Consistent tuning parameter selection in high dimensional sparse linear regression
- Extended Bayesian information criteria for model selection with large model spaces
- Hedonic housing prices and the demand for clean air
- Leverage and Breakdown in \(L_1\) Regression
- Limiting distributions for \(L_1\) regression estimators under general conditions
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Nonconcave penalized M-estimation with a diverging number of parameters
- Nonconcave penalized likelihood with a diverging number of parameters
- One-step sparse estimates in nonconcave penalized likelihood models
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Robust spline-based variable selection in varying coefficient model
- Robust weighted LAD regression
- Shrinkage tuning parameter selection with a diverging number of parameters
- Simultaneous estimation and variable selection in median regression using Lasso-type penalty
- The Adaptive Lasso and Its Oracle Properties
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Unified LASSO Estimation by Least Squares Approximation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection in quantile regression
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Weighted Wilcoxon‐Type Smoothly Clipped Absolute Deviation Method
- \(M\)-estimation of linear models with dependent errors
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
Cited in (9)
- WLAD-LASSO method for robust estimation and variable selection in partially linear models
- Group selection via adjusted weighted least absolute deviation regression
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Robust check loss-based inference of semiparametric models and its application in environmental data
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Outlier detection and robust variable selection via the penalized weighted LAD-LASSO method
- Mathematical programming for simultaneous feature selection and outlier detection under the \(l_1\) norm
- Doubly robust weighted composite quantile regression based on SCAD‐L2
- SCAD penalized rank regression with a diverging number of parameters
This page was built for publication: Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters