C4.5
From MaRDI portal
Software: 24107
swMATH: 12176 · MaRDI QID: Q24107 · FDO: Q24107
Author name not available
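C4.5 grows decision trees by choosing, at each node, the attribute split that maximizes the gain ratio (information gain normalized by split information). A minimal sketch of that criterion for a categorical attribute, assuming plain Python with illustrative function names (not taken from the original implementation):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """C4.5's split criterion for one categorical attribute:
    information gain divided by the split information."""
    n = len(labels)
    # Partition the labels by attribute value.
    parts = {}
    for v, y in zip(values, labels):
        parts.setdefault(v, []).append(y)
    # Information gain: parent entropy minus weighted child entropy.
    cond = sum(len(p) / n * entropy(p) for p in parts.values())
    gain = entropy(labels) - cond
    # Split information: entropy of the partition sizes themselves.
    split_info = -sum(len(p) / n * math.log2(len(p) / n) for p in parts.values())
    return gain / split_info if split_info > 0 else 0.0
```

For example, a perfectly separating binary split such as `gain_ratio(['a', 'a', 'b', 'b'], [0, 0, 1, 1])` yields 1.0, while an attribute with a single value yields 0.0. The normalization by split information is what distinguishes C4.5 from ID3, which uses raw information gain and so favors many-valued attributes.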
Cited in (showing the first 100 items)
- On using Bayesian networks for complexity reduction in decision trees
- Modeling churn using customer lifetime value
- \texttt{OpenML}: an \texttt{R} package to connect to the machine learning platform OpenML
- MEPAR-miner: Multi-expression programming for classification rule mining
- Incremental tree-based missing data imputation with lexicographic ordering
- Boosting the margin: a new explanation for the effectiveness of voting methods
- TASC: two-attribute-set clustering through decision tree construction
- Supervised classification and mathematical optimization
- SIRUS: stable and interpretable RUle set for classification
- Recursive partitioning for missing data imputation in the presence of interaction effects
- Wombit: a portfolio bit-vector solver using word-level propagation
- Enhancing evolutionary fuzzy systems for multi-class problems: distance-based relative competence weighting with truncated confidences (DRCW-TC)
- Empowering difficult classes with a similarity-based aggregation in multi-class classification problems
- A new variable selection approach using random forests
- Handling missing values when applying classification models
- partykit: a modular toolkit for recursive partytioning in \texttt{R}
- Arcing classifiers. (With discussion)
- WEKA -- experiences with a Java open-source project
- Statistical fraud detection: a review
- Modeling threshold interaction effects through the logistic classification trunk
- Attribute bagging: Improving accuracy of classifier ensembles by using random feature subsets.
- Learning certifiably optimal rule lists for categorical data
- XRules: an effective algorithm for structural classification of XML data
- Finding a short and accurate decision rule in disjunctive normal form by exhaustive search
- A statistical approach to growing a reliable honest tree.
- Do we need hundreds of classifiers to solve real world classification problems?
- Learning ELM-tree from big data based on uncertainty reduction
- Modeling discrete time-to-event data
- Probabilistic confusion entropy for evaluating classifiers
- Discovery Science
- An interval set model for learning rules from incomplete information table
- Maximum-entropy estimated distribution model for classification problems
- Belief decision trees: Theoretical foundations
- Multi-criteria classification -- a new scheme for application of dominance-based decision rules
- Bayesian network classifiers for identifying the slope of the customer lifecycle of long-life customers.
- A self-adaptive multi-engine solver for quantified Boolean formulas
- An experimental evaluation of simplicity in rule learning
- Bayesian network classifiers
- Regranulation: a granular algorithm enabling communication between granular worlds
- T3C: improving a decision tree classification algorithm's interval splits on continuous attributes
- Unsupervised and supervised data classification via nonsmooth and global optimization (with comments and rejoinder)
- Comprehensible credit scoring models using rule extraction from support vector machines
- Selecting informative features with fuzzy-rough sets and its application for complex systems monitoring
- Using model trees for classification
- Uncertainty measures of rough set prediction
- Belief rule-based classification system: extension of FRBCS in belief functions framework
- ILA-2: an inductive learning algorithm for knowledge discovery
- A rough-fuzzy approach for generating classification rules
- A simple greedy algorithm for finding functional relations: Efficient implementation and average case analysis
- Is it worth generating rules from neural network ensembles?
- On the handling of fuzziness for continuous-valued attributes in decision tree generation
- Ensemble of optimal trees, random forest and random projection ensemble classification
- Predictive learning via rule ensembles
- The regression trunk approach to discover treatment covariate interaction
- Krimp: mining itemsets that compress
- Inducer: a public domain workbench for data mining
- LFOIL: linguistic rule induction in the label semantics framework
- Classification trees with soft splits optimized for ranking
- BEST: a decision tree algorithm that handles missing values
- Ensemble of randomized soft decision trees for robust classification
- Selection of relevant features and examples in machine learning
- Wrappers for feature subset selection
- Theoretical and empirical analysis of ReliefF and RReliefF
- Supersparse linear integer models for optimized medical scoring systems
- A new variable importance measure for random forests with missing data
- Classification of imbalanced data with a geometric digraph family
- Learning customized and optimized lists of rules with mathematical programming
- WHIRL: A word-based information representation language
- Bayesian treed models
- RHSBoost: improving classification performance in imbalance data
- Interpretable classifiers using rules and Bayesian analysis: building a better stroke prediction model
- Fu-SulfPred: identification of protein S-sulfenylation sites by fusing forests via Chou's general PseAAC
- Extremely randomized trees
- Logistic model trees
- A note on split selection bias in classification trees
- Supervised feature selection by clustering using conditional mutual information-based distances
- DOI: 10.1162/153244302320884605
- Learning a symbolic representation for multivariate time series classification
- Learning decision trees with taxonomy of propositionalized attributes
- An unbiased method for constructing multilabel classification trees
- Effect of pruning and early stopping on performance of a boosting ensemble.
- Computational methods for data analysis
- Improving malware detection by applying multi-inducer ensemble
- Supervised classification using probabilistic decision graphs
- Measures of ruleset quality for general rules extraction methods
- Inducing decision trees via concept lattices
- Unbiased variable selection for classification trees with multivariate responses
- Detection of unknown computer worms based on behavioral classification of the host
- Design of nearest neighbor classifiers: Multi-objective approach
This page was built for software: C4.5