High-dimensional variable selection with reciprocal L₁-regularization
Publication: 5367472
DOI: 10.1080/01621459.2014.984812
zbMATH Open: 1373.62358
OpenAlex: W1992571403
MaRDI QID: Q5367472
FDO: Q5367472
Publication date: 13 October 2017
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.2014.984812
Recommendations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Regularizing LASSO: a consistent variable selection method
- Nearly unbiased variable selection under minimax concave penalty
- Variable selection and estimation using a continuous approximation to the \(L_0\) penalty
- Laplace error penalty-based variable selection in high dimension
Keywords: Lasso; variable selection; smoothly clipped absolute deviation (SCAD); penalized likelihood methods; reciprocal Lasso; stochastic approximation annealing
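For orientation, a minimal sketch of the penalized least-squares objective behind the "reciprocal Lasso" keyword above, assuming the usual reciprocal \(L_1\) penalty form from this literature; the symbols \(y\), \(X\), \(\beta\), and \(\lambda\) are our own notation, not taken from this page:

\[
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 \;+\; \lambda \sum_{j=1}^{p} \frac{\mathbb{1}(\beta_j \neq 0)}{\lvert \beta_j \rvert}.
\]

Unlike the Lasso penalty \(\lambda \lvert \beta_j \rvert\), this penalty decreases in \(\lvert \beta_j \rvert\), so small nonzero coefficients are penalized heavily and are driven to exactly zero.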
Cited In (19)
- Optimal false discovery control of minimax estimators
- Bayesian reciprocal LASSO quantile regression
- Markov Neighborhood Regression for High-Dimensional Inference
- Nearly optimal Bayesian shrinkage for high-dimensional regression
- The reciprocal Bayesian bridge for left-censored data
- Title not available
- An overview of reciprocal \(L_1\)-regularization for high dimensional regression data
- Spatial Homogeneity Pursuit of Regression Coefficients for Large Datasets
- Markov neighborhood regression for statistical inference of high-dimensional generalized linear models
- \(L_1\)-regularized least squares for support recovery of high dimensional single index models with Gaussian designs
- A binary hidden Markov model on spatial network for amyotrophic lateral sclerosis disease spreading pattern analysis
- The reciprocal Bayesian Lasso
- Regularization through variable selection and conditional MLE with application to classification in high dimensions
- High Dimensional Variable Selection via Tilting
- Bayesian Neural Networks for Selection of Drug Sensitive Genes
- Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes
- High-dimensional posterior consistency for hierarchical non-local priors in regression
- Title not available
- Variable selection for nonparametric quantile regression via measurement error model