Feature selection with ensembles, artificial variables, and redundancy elimination
Publication: 2880928
zbMATH Open: 1235.62003
MaRDI QID: Q2880928
FDO: Q2880928
Authors: Eugene Tuv, Alexander Borisov, George Runger, Kari Torkkola
Publication date: 17 April 2012
Published in: Journal of Machine Learning Research (JMLR)
Full work available at URL: http://www.jmlr.org/papers/v10/tuv09a.html
Recommendations
- Ensemble feature selection for high dimensional data: a new method and a comparative study
- Feature selection for ensemble learning
- Information-theoretic algorithm for feature selection
- Ensemble Algorithms for Feature Selection
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
Cited In (10)
- Machine learning in corporate credit rating assessment using the expanded audit report
- Unsupervised feature selection with ensemble learning
- Benchmark and Survey of Automated Machine Learning Frameworks
- An efficient random forests algorithm for high dimensional data classification
- Two-level quantile regression forests for bias correction in range prediction
- A hybrid system with filter approach and multiple population genetic algorithm for feature selection in credit scoring
- Unrestricted permutation forces extrapolation: variable importance requires at least one more model, or there is no free variable importance
- Sparse parameter identification of stochastic dynamical systems
- Support Recovery and Parameter Identification of Multivariate ARMA Systems with Exogenous Inputs
- Fault detection and isolation of gas turbine: hierarchical classification and confidence rate computation