Learning Boolean concepts in the presence of many irrelevant features
MaRDI QID: Q1337683
DOI: 10.1016/0004-3702(94)90084-1
zbMath: 0942.68657
OpenAlex: W2039537889
Hussein Almuallim, Thomas G. Dietterich
Publication date: 26 February 1996
Published in: Artificial Intelligence
Full work available at URL: https://doi.org/10.1016/0004-3702(94)90084-1
Related Items (29)
- Using simulated annealing to optimize the feature selection problem in marketing applications
- Feature selection for support vector machines using generalized Benders decomposition
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- The backbone method for ultra-high dimensional sparse machine learning
- A novel feature selection approach: combining feature wrappers and filters
- Wrappers for feature subset selection
- Partial Occam's Razor and its applications
- Pruning boxes in a box-based classification method
- Efficient feature selection based on correlation measure between continuous and discrete features
- Feature selection in machine learning via variable neighborhood search
- Improved feature selection with simulation optimization
- Parameterized Learnability of k-Juntas and Related Problems
- An adaptive heuristic for feature selection based on complementarity
- Evaluating feature selection methods for learning in data mining applications
- DB-HReduction: a data preprocessing algorithm for data mining applications
- Approximate inference of functional dependencies from relations
- Application of a Generalization of Russo's Formula to Learning from Multiple Random Oracles
- Genetic algorithm-based feature set partitioning for classification problems
- Reformulation of the support set selection problem in the logical analysis of data
- Bagging constraint score for feature selection with pairwise constraints
- Multi-objective feature selection using a Bayesian artificial immune system
- Optimization-based feature selection with adaptive instance sampling
- Correntropy based feature selection using binary projection
- Feature selection algorithms in classification problems: an experimental evaluation
- Feature selection based on a modified fuzzy C-means algorithm with supervision
- Parameterized learnability of juntas
- TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions
- Fuzzy similarity and entropy (FSAE) feature selection revisited by using intra-class entropy and a normalized scaling factor
- Consistency-based search in feature selection
Cites Work
- Occam's razor
- A general lower bound on the number of examples needed for learning
- Optimum feature selection by zero-one integer programming
- Learnability and the Vapnik-Chervonenkis dimension
- A Greedy Heuristic for the Set-Covering Problem
- A Branch and Bound Algorithm for Feature Subset Selection
- A note on some feature selection criteria
- A Comparison of Seven Techniques for Choosing Subsets of Pattern Recognition Properties