Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory
DOI: 10.1002/bimj.201300068
zbMATH Open: 1441.62404
OpenAlex: W1945743190
Wikidata: Q38183339 (Scholia: Q38183339)
MaRDI QID: Q2875744
FDO: Q2875744
Authors: Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, Andreas Ziegler
Publication date: 11 August 2014
Published in: Biometrical Journal
Full work available at URL: https://doi.org/10.1002/bimj.201300068
Recommendations
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: applications
- Rejoinder to: Probability estimation with machine learning methods for dichotomous and multicategory outcome
- Correction: Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications
- Multiclass Probability Estimation With Support Vector Machines
- Probability estimation for large-margin classifiers
- An efficient model-free estimation of multiclass conditional probability
- scientific article; zbMATH DE number 755572
- scientific article; zbMATH DE number 1140598
Keywords: nonparametric regression; probability estimation; support vector machine; random forest; bagged nearest neighbor
MSC classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Applications of statistics to biology and medical sciences; meta analysis (62P10)
- Learning and adaptive systems in artificial intelligence (68T05)
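As an illustrative sketch of the publication's topic (not code from the paper itself): the bagged nearest-neighbour method named in the keywords estimates the conditional class probability P(Y = 1 | X = x) by averaging 1-nearest-neighbour votes over bootstrap resamples of the training set. The function name, toy data, and 1-D Euclidean distance below are assumptions made for the example.

```python
import random

def bagged_1nn_proba(train, labels, x, n_boot=200, seed=0):
    """Estimate P(Y=1 | X=x) for a dichotomous outcome by averaging
    1-nearest-neighbour votes over bootstrap resamples (bagged NN)."""
    rng = random.Random(seed)
    n = len(train)
    votes = 0
    for _ in range(n_boot):
        # Draw a bootstrap sample of indices (with replacement).
        idx = [rng.randrange(n) for _ in range(n)]
        # Nearest neighbour of x within the resample (1-D Euclidean distance).
        j = min(idx, key=lambda i: abs(train[i] - x))
        votes += labels[j]
    # The vote fraction is the probability estimate.
    return votes / n_boot

# Toy 1-D data: class 1 is concentrated on the right of the axis.
xs = [0.0, 0.1, 0.2, 0.8, 0.9, 1.0]
ys = [0, 0, 0, 1, 1, 1]
p = bagged_1nn_proba(xs, ys, 0.95)
print(p)  # close to 1, since x = 0.95 lies deep in the class-1 region
```

Unlike a plain classifier, the averaged vote returns a value in [0, 1] that can be interpreted directly as a probability estimate, which is the distinction the paper develops.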
Cites Work
- Consistency of random forests and other averaging classifiers
- Title not available
- Title not available
- Strictly Proper Scoring Rules, Prediction, and Estimation
- Random forests
- Bagging predictors
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Random Forests and Adaptive Nearest Neighbors
- On bagging and nonlinear estimation
- Consistent nonparametric regression. Discussion
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Sparsity oracle inequalities for the Lasso
- Title not available
- Multivariable Model‐Building
- Robust Truncated Hinge Loss Support Vector Machines
- Multicategory Support Vector Machines
- Optimal global rates of convergence for nonparametric regression
- On the mathematical foundations of learning
- Regression modeling strategies. With applications to linear models, logistic regression and survival analysis
- Analyzing bagging
- Quantile regression forests
- Logistic disease incidence models and case-control studies
- Optimal weighted nearest neighbour classifiers
- A distribution-free theory of nonparametric regression
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Soft and hard classification by reproducing kernel Hilbert space methods
- Title not available
- 10.1162/15324430260185646
- Convexity, Classification, and Risk Bounds
- Clinical prediction models. A practical approach to development, validation, and updating.
- Probability estimates for multi-class classification by pairwise coupling
- Choosing multiple parameters for support vector machines
- Tree induction for probability-based ranking
- Probability estimation for large-margin classifiers
- Robust Model-Free Multiclass Probability Estimation
- Hard or soft classification? Large-margin unified machines
- On L1-Norm Multiclass Support Vector Machines
- Consistent window estimation in nonparametric regression
- On the strong universal consistency of nearest neighbor regression function estimates
- Fast rates for support vector machines using Gaussian kernels
- Regression trees for predicting mortality in patients with cardiovascular disease: what improvement is achieved by using ensemble-based methods?
- Title not available
- Support vector machines with applications
- Distribution-free consistency results in nonparametric discrimination and regression function estimation
- Nonparametric regression estimation using penalized least squares
- Nonparametric estimation via empirical risk minimization
- On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification
- Rates of convergence for partitioning and nearest neighbor regression estimates with unbounded data
- Optimal global rates of convergence for nonparametric regression with unbounded data
- Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
- Statistical analysis of some multi-category large margin classification methods
- Universal consistency of local polynomial kernel regression estimates
- Strong universal consistency of smooth kernel regression estimates
- Boosted classification trees and class probability/quantile estimation
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning
- Variance reduction in purely random forests
- Properties of Bagged Nearest Neighbour Classifiers
- On the rate of convergence of the bagged nearest neighbor estimate
- Risk models in genetic epidemiology
- Statistical learning for biomedical data
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: applications
- Leveraging external knowledge on molecular interactions in classification methods for risk prediction of patients
Cited In (20)
- Calibrating machine learning approaches for probability estimation: a comprehensive comparison
- Support Vector Machines, Kernel Logistic Regression and Boosting
- Spatial performance analysis in basketball with CART, random forest and extremely randomized trees
- Title not available
- A random forest guided tour
- Variable selection in large margin classifier-based probability estimation with high-dimensional predictors
- Machine learning versus statistical modeling
- Rejoinder to: Probability estimation with machine learning methods for dichotomous and multicategory outcome
- The use of components' weights improves the diagnostic accuracy of a health-related index
- Correction: Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications
- Probability estimation with machine learning methods for dichotomous and multicategory outcome: applications
- A Categorical Principal Component Regression on Computer-Assisted Instruction in Probability Domain
- Aleatoric and epistemic uncertainty in machine learning: an introduction to concepts and methods
- Hypervolume under ROC manifold for discrete biomarkers with ties
- Comparison of various machine learning algorithms for estimating generalized propensity score
- Methods for correcting inference based on outcomes predicted by machine learning
- Class probability estimation for medical studies
- Probability estimation and machine learning -- editorial
- Risk prediction with machine learning and regression methods
- What subject matter questions motivate the use of machine learning approaches compared to statistical models for probability prediction?