Predictive learning via rule ensembles
Abstract: General regression and classification models are constructed as linear combinations of simple rules derived from the data. Each rule consists of a conjunction of a small number of simple statements concerning the values of individual input variables. These rule ensembles are shown to produce predictive accuracy comparable to the best methods. However, their principal advantage lies in interpretation. Because of its simple form, each rule is easy to understand, as is its influence on individual predictions, selected subsets of predictions, or globally over the entire space of joint input variable values. Similarly, the degree of relevance of the respective input variables can be assessed globally, locally in different regions of the input space, or at individual prediction points. Techniques are presented for automatically identifying those variables that are involved in interactions with other variables, the strength and degree of those interactions, as well as the identities of the other variables with which they interact. Graphical representations are used to visualize both main and interaction effects.
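The construction the abstract describes — simple conjunctive rules used as indicator features of a linear model — can be illustrated with a minimal sketch. This is not the authors' implementation: the dataset and the candidate rules below are hypothetical, the rules are written by hand rather than harvested from a tree ensemble, and plain least squares stands in for the penalized regression the paper uses.

```python
import numpy as np

# Hypothetical data: two input variables, and a response driven by one
# conjunction of simple statements (plus noise).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = 2.0 * ((X[:, 0] > 0.5) & (X[:, 1] > 0.5)) + 0.1 * rng.normal(size=200)

# Candidate rules: each is a conjunction of a small number of simple
# statements about individual inputs. In the paper these would be
# derived from the data (e.g. paths of a tree ensemble); here they are
# hand-written for illustration.
rules = [
    lambda x: (x[:, 0] > 0.5) & (x[:, 1] > 0.5),
    lambda x: (x[:, 0] > 0.5),
    lambda x: (x[:, 1] <= 0.5),
]

# Rule indicator functions become 0/1 feature columns; add an intercept.
R = np.column_stack([r(X).astype(float) for r in rules] + [np.ones(len(X))])

# Fit the linear combination of rules. The paper uses lasso-penalized
# regression to select a sparse rule subset; ordinary least squares is
# used here only to keep the sketch self-contained.
coef, *_ = np.linalg.lstsq(R, y, rcond=None)

# Interpretation is direct: each coefficient is the additive effect of
# one easily read rule. The first rule should recover an effect near 2.
print([round(c, 2) for c in coef])
```

Because each fitted coefficient attaches to a single human-readable conjunction, the influence of a rule on any individual prediction is just its coefficient times its 0/1 indicator, which is the interpretability advantage the abstract emphasizes.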
Cites work
- scientific article; zbMATH DE number 3860199
- scientific article; zbMATH DE number 47282
- scientific article; zbMATH DE number 708500
- scientific article; zbMATH DE number 739533
- scientific article; zbMATH DE number 1149421
- scientific article; zbMATH DE number 1179314
- scientific article; zbMATH DE number 1894664
- scientific article; zbMATH DE number 784360
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 5274667
- An overtraining-resistant stochastic modeling method for pattern recognition
- Bagging predictors
- Greedy function approximation: A gradient boosting machine.
- On bagging and nonlinear estimation
- Quasi-regression
- Random forests
- Robust Estimation of a Location Parameter
- The elements of statistical learning. Data mining, inference, and prediction
Cited in (75 documents)
- Stochastic tree search for estimating optimal dynamic treatment regimes
- Data mining in electronic commerce
- Embedding black-box regression techniques into hierarchical Bayesian models
- On exact distribution for multivariate weighted distributions and classification
- scientific article; zbMATH DE number 861623
- Interaction forests: identifying and exploiting interpretable quantitative and qualitative interaction effects
- Visualizing Variable Importance and Variable Interaction Effects in Machine Learning Models
- Consistent regression using data-dependent coverings
- Techniques to improve ecological interpretability of black-box machine learning models. Case study on biological health of streams in the United States with gradient boosted trees
- A Bayesian framework for learning rule sets for interpretable classification
- The Delaunay triangulation learner and its ensembles
- Explaining anomalies in groups with characterizing subspace rules
- Random Forests for Spatially Dependent Data
- Exploratory model comparison. Interactive model ensemble selection and management
- scientific article; zbMATH DE number 1832345
- Constructing ensembles of symbolic classifiers
- An improved branch-and-bound method for maximum monomial agreement
- Node harvest
- A Shapley-Owen index for interaction quantification
- FOLD-R++: a scalable toolset for automated inductive learning of default theories from mixed data
- A \(\mathbb R\)eal generalization of discrete AdaBoost
- Inference on moderation effect with third-variable effect analysis – application to explore the trend of racial disparity in oncotype dx test for breast cancer treatment
- A survey on the explainability of supervised machine learning
- Tree ensembles with rule structured horseshoe regularization
- scientific article; zbMATH DE number 1966523
- Seeing the Forest Through the Trees
- An ensemble approach for in silico prediction of Ames mutagenicity
- Supervised classification and mathematical optimization
- Linear Aggregation in Tree-Based Estimators
- Disjunctive Rule Lists
- Making complex prediction rules applicable for readers: current practice in random forest literature and recommendations
- Learning customized and optimized lists of rules with mathematical programming
- Ensemble classification based on generalized additive models
- Extreme value correction: a method for correcting optimistic estimations in rule learning
- scientific article; zbMATH DE number 6276187
- pre
- iml
- Forest Garrote
- Classification rules in relaxed logical form
- Rationalizing predictions by adversarial information calibration
- Conclusive local interpretation rules for random forests
- Interpreting deep learning models with marginal attribution by conditioning on quantiles
- rules
- Considerations when learning additive explanations for black-box models
- Interpretation of black-box predictive models
- Interpretable regularized class association rules algorithm for classification in a categorical data space
- On \(b\)-bit min-wise hashing for large-scale regression and classification with sparse data
- xrf
- Interpretable classifiers using rules and Bayesian analysis: building a better stroke prediction model
- Garrote trees as tree structured regression analysis
- A pool-based pattern generation algorithm for logical analysis of data with automatic fine-tuning
- SIRUS: stable and interpretable RUle set for classification
- Model transparency and interpretability: survey and application to the insurance industry
- Boosting insights in insurance tariff plans with tree-based machine learning methods
- Recent advances in the theory and practice of logical analysis of data
- Testing conditional independence in supervised learning algorithms
- Interactive Slice Visualization for Exploring Machine Learning Models
- Crossed-derivative based sensitivity measures for interaction screening
- hstats
- IADT
- Detection of interacting variables for generalized linear models via neural networks
- Rule ensemble method with adaptive group Lasso for heterogeneous treatment effect estimation
- scientific article; zbMATH DE number 7750672
- Guiding Local Regression Using Visualisation
- Supervised Machine Learning Techniques: An Overview with Applications to Banking
- SOAR: simultaneous or-of-and rules for classification of positive and negative classes
- Firms' profitability and ESG score: a machine learning approach
- Generating explainable rule sets from tree-ensemble learning methods by answer set programming
- Neural-symbolic temporal decision trees for multivariate time series classification
- Patterns of differential expression by association in omic data using a new measure based on ensemble learning
- Interaction analysis under misspecification of main effects: some common mistakes and simple solutions
- Additive regression trees and smoothing splines -- predictive modeling and interpretation in data mining
- Spatial performance analysis in basketball with CART, random forest and extremely randomized trees
- Understanding the effect of contextual factors and decision making on team performance in Twenty20 cricket: an interpretable machine learning approach
- On the efficient implementation of classification rule learning
This page was built for publication: Predictive learning via rule ensembles (MaRDI item Q71543)