A neutral comparison of algorithms to minimize L₀ penalties for high-dimensional variable selection
Publication: 6625366
Cites work
- Scientific article; zbMATH DE number 6982301 (no title available)
- Scientific article; zbMATH DE number 7306923 (no title available)
- A Mixed-Integer Fractional Optimization Approach to Best Subset Selection
- A new look at the statistical model identification
- Adaptive robust variable selection
- Estimating the dimension of a model
- Extending the Modified Bayesian Information Criterion (mBIC) to Dense Markers and Multiple Interval Mapping
- False discoveries occur early on the Lasso path
- Fast best subset selection: coordinate descent and local combinatorial optimization algorithms
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
- Penalized composite quasi-likelihood for ultrahigh dimensional variable selection
- Phenotypes and genotypes. The search for influential genes
- Sparse regression: scalable algorithms and empirical performance
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The risk inflation criterion for multiple regression
- To explain or to predict?
- Variable selection -- a review and recommendations for the practicing statistician