Rejoinder: ``Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons''
From MaRDI portal
Publication:2225320
Cited in (13):
- Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
- Leveraged least trimmed absolute deviations
- Integrating prediction in mean-variance portfolio optimization
- Scalable penalized spatiotemporal land-use regression for ground-level nitrogen dioxide
- MIP-BOOST: Efficient and Effective L0 Feature Selection for Linear Regression
- scientific article; zbMATH DE number 7370569 (no title available)
- Interaction Model and Model Selection for Function-on-Function Regression
- Extending greedy feature selection algorithms to multiple solutions
- Approximate Selective Inference via Maximum Likelihood
- Mixed integer quadratic optimization formulations for eliminating multicollinearity based on variance inflation factor
- Sparsifying the least-squares approach to PCA: comparison of lasso and cardinality constraint
- Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles
- Sparse index tracking using sequential Monte Carlo