Learning Boolean concepts in the presence of many irrelevant features
Publication: Q1337683
DOI: 10.1016/0004-3702(94)90084-1
zbMATH Open: 0942.68657
OpenAlex: W2039537889
MaRDI QID: Q1337683
Hussein Almuallim, Thomas G. Dietterich
Publication date: 26 February 1996
Published in: Artificial Intelligence
Full work available at URL: https://doi.org/10.1016/0004-3702(94)90084-1
Cites Work
- Title not available
- Title not available
- Title not available
- A Greedy Heuristic for the Set-Covering Problem
- Learnability and the Vapnik-Chervonenkis dimension
- A Branch and Bound Algorithm for Feature Subset Selection
- A general lower bound on the number of examples needed for learning
- Occam's razor
- A Comparison of Seven Techniques for Choosing Subsets of Pattern Recognition Properties
- Optimum feature selection by zero-one integer programming
- A note on some feature selection criteria
Cited In (29)
- TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions
- An adaptive heuristic for feature selection based on complementarity
- Fuzzy similarity and entropy (FSAE) feature selection revisited by using intra-class entropy and a normalized scaling factor
- Feature selection based on a modified fuzzy C-means algorithm with supervision
- Reformulation of the support set selection problem in the logical analysis of data
- Approximate inference of functional dependencies from relations
- Feature selection for support vector machines using generalized Benders decomposition
- Parameterized learnability of juntas
- DB-HReduction: a data preprocessing algorithm for data mining applications
- Partial Occam's Razor and its applications
- Improved feature selection with simulation optimization
- Genetic algorithm-based feature set partitioning for classification problems
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- Bagging constraint score for feature selection with pairwise constraints
- The backbone method for ultra-high dimensional sparse machine learning
- Using simulated annealing to optimize the feature selection problem in marketing applications
- Efficient feature selection based on correlation measure between continuous and discrete features
- Optimization-based feature selection with adaptive instance sampling
- Consistency-based search in feature selection
- Evaluating feature selection methods for learning in data mining applications
- Feature selection in machine learning via variable neighborhood search
- Application of a Generalization of Russo's Formula to Learning from Multiple Random Oracles
- Pruning boxes in a box-based classification method
- Parameterized Learnability of k-Juntas and Related Problems
- Feature selection algorithms in classification problems: an experimental evaluation
- Correntropy based feature selection using binary projection
- Wrappers for feature subset selection
- Multi‐objective feature selection using a Bayesian artificial immune system
- A novel feature selection approach: combining feature wrappers and filters