SCAD penalized rank regression with a diverging number of parameters
From MaRDI portal
Publication: 476249
DOI: 10.1016/j.jmva.2014.09.014
zbMath: 1302.62130
OpenAlex: W1999681893
MaRDI QID: Q476249
Publication date: 28 November 2014
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2014.09.014
Related Items (6)
- Rank method for partial functional linear regression models
- On the oracle property of a generalized adaptive elastic-net for multivariate linear regression with a diverging number of parameters
- Recognition and variable selection in sparse spatial panel data models with fixed effects
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- Penalized profile least squares-based statistical inference for varying coefficient partially linear errors-in-variables models
- Efficient and doubly-robust methods for variable selection and parameter estimation in longitudinal data analysis
Uses Software
Cites Work
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Least squares approximation with a diverging number of parameters
- Composite quantile regression and the oracle model selection theory
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Limiting distributions for \(L_1\) regression estimators under general conditions
- Nonconcave penalized likelihood with a diverging number of parameters.
- On the adaptive elastic net with a diverging number of parameters
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Coordinate descent algorithms for lasso penalized regression
- Wilcoxon-type generalized Bayesian information criterion
- Weighted Wilcoxon‐Type Smoothly Clipped Absolute Deviation Method
- Estimating Regression Coefficients by Minimizing the Dispersion of the Residuals
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- Asymptotic oracle properties of SCAD-penalized least squares estimators
- Tuning parameter selectors for the smoothly clipped absolute deviation method