L0Learn
From MaRDI portal
Software: 52543
swMATH: 36841
CRAN: L0Learn
MaRDI QID: Q52543
FDO: Q52543
Fast Algorithms for Best Subset Selection
Rahul Mazumder, Hussein Hazimeh, Tim Nonet
Last update: 7 March 2023
Copyright license: MIT license + file LICENSE
Software version identifier: 2.1.0
Source code repository: https://github.com/cran/L0Learn
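As a brief illustration of what the package does (not part of the portal record), the following R sketch fits an L0-penalized best subset regression path using the package's documented interface (L0Learn.fit, L0Learn.cvfit, coef, predict). The simulated data, seed, and the specific lambda index are placeholder assumptions for illustration only.

```r
# Minimal usage sketch for the L0Learn R package; data are simulated for illustration.
library(L0Learn)

set.seed(1)
n <- 500; p <- 50
X <- matrix(rnorm(n * p), n, p)
beta <- c(rep(2, 5), rep(0, p - 5))   # only the first 5 predictors are active
y <- X %*% beta + rnorm(n)

# Fit a regularization path with a pure L0 penalty, capping the support size.
fit <- L0Learn.fit(X, y, penalty = "L0", maxSuppSize = 20)

# Inspect the path (lambda, gamma, support size per solution).
print(fit)

# Extract coefficients and in-sample predictions at one point on the path;
# the index 5 is a placeholder, pick a lambda from the printed path instead.
lam <- fit$lambda[[1]][5]
b <- coef(fit, lambda = lam, gamma = 0)
yhat <- predict(fit, newx = X, lambda = lam, gamma = 0)

# Optionally cross-validate an L0L2 penalty over several ridge parameters.
cvfit <- L0Learn.cvfit(X, y, penalty = "L0L2", nGamma = 5, nFolds = 10)
```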
Cited In (21)
- Subset selection in network-linked data
- A Mixed-Integer Fractional Optimization Approach to Best Subset Selection
- Discussion of ``Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons''
- Rejoinder: ``Sparse regression: scalable algorithms and empirical performance''
- Semi-automated simultaneous predictor selection for regression-SARIMA models
- Sparse classification: a scalable discrete optimization perspective
- MIP-BOOST: Efficient and Effective L0 Feature Selection for Linear Regression
- Title not available
- Title not available
- A discussion on practical considerations with sparse regression methodologies
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- Rejoinder: ``Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons''
- The backbone method for ultra-high dimensional sparse machine learning
- Graph structured sparse subset selection
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Robust subset selection
- Randomized Gradient Boosting Machine
- The Trimmed Lasso: Sparse Recovery Guarantees and Practical Optimization by the Generalized Soft-Min Penalty
- Scalable Algorithms for the Sparse Ridge Regression
- Mining events with declassified diplomatic documents