The backbone method for ultra-high dimensional sparse machine learning
From MaRDI portal
Publication:2163249
Cites work
- scientific article; zbMATH DE number 3860199 (no title available)
- scientific article; zbMATH DE number 5485440 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- scientific article; zbMATH DE number 6438182 (no title available)
- DOI: 10.1162/153244303322753616
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A data-driven software tool for enabling cooperative information sharing among police departments
- A split-and-merge Bayesian variable selection approach for ultrahigh dimensional regression
- An error bound for \(L_1\)-norm support vector machine coefficients in ultra-high dimension
- An outer-approximation algorithm for a class of mixed-integer nonlinear programs
- Best subset selection via a modern optimization lens
- Branch-and-price: Column generation for solving huge integer programs
- Characterization of the equivalence of robustification and regularization in linear and matrix regression
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Divide-and-conquer for debiased \(l_1\)-norm support vector machine in ultra-high dimensions
- Enlarging the margins in perceptron decision trees
- Entropy-based model-free feature screening for ultrahigh-dimensional multiclass classification
- Fast best subset selection: coordinate descent and local combinatorial optimization algorithms
- Fifty years of classification and regression trees
- Gene selection for cancer classification using support vector machines
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- Learning Boolean concepts in the presence of many irrelevant features
- Least angle regression. (With discussion)
- MIP-BOOST: Efficient and Effective L0 Feature Selection for Linear Regression
- Making decision trees feasible in ultrahigh feature and label dimensions
- Mathematical optimization in classification and regression trees
- Multisurface method of pattern separation for medical diagnosis applied to breast cytology
- Nearly unbiased variable selection under minimax concave penalty
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- Optimal classification trees
- Optimal randomized classification trees
- Optimization problems for machine learning: a survey
- Random forests
- Regularization and Variable Selection Via the Elastic Net
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Robust Regression and Lasso
- Scalable algorithms for the sparse ridge regression
- Scikit-learn: machine learning in Python
- Searching for backbones -- an efficient parallel algorithm for the traveling salesman problem
- Sparse Approximate Solutions to Linear Systems
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Sparse learning via Boolean relaxations
- Sparse regression: scalable algorithms and empirical performance
- Sparsity in optimal randomized classification trees
- Supersparse linear integer models for optimized medical scoring systems
- Sure independence screening for ultrahigh dimensional feature space. (With discussion and authors' reply)
- Sure independence screening in generalized linear models with NP-dimensionality
- The all-or-nothing phenomenon in sparse linear regression
- Ultrahigh dimensional feature selection: beyond the linear model
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties