Building Trees for Probabilistic Prediction via Scoring Rules
From MaRDI portal
MaRDI QID: Q6637486
DOI: 10.1080/00401706.2024.2343062
Authors: Sara Shashaani, Matthew Plumlee, Seth D. Guikema
Publication date: 13 November 2024
Published in: Technometrics
Cites Work
- An introduction to statistical learning. With applications in R
- Generalized random forests
- Coherent dispersion criteria for optimal experimental design
- Consistency of random forests
- Title not available
- Strictly Proper Scoring Rules, Prediction, and Estimation
- Probabilistic Forecasts, Calibration and Sharpness
- Recursive partitioning for heterogeneous causal effects
- Survival Trees by Goodness of Split
- The tight constant in the Dvoretzky-Kiefer-Wolfowitz inequality
- Quantile regression forests
- Asymptotic Minimax Character of the Sample Distribution Function and of the Classical Multinomial Estimator
- Statistical Methods for Eliciting Probability Distributions
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- The Elements of Statistical Learning
- The geometry of proper scoring rules
- Consistent nonparametric regression from recursive partitioning schemes
- Building consistent regression trees from complex sample data
- Technical note: Some properties of splitting criteria
- Estimation of the continuous ranked probability score with limited information and applications to ensemble weather forecasts
- An Overview of Applications of Proper Scoring Rules
- Proper Scoring Rules for Evaluating Density Forecasts with Asymmetric Loss Functions