Easy Uncertainty Quantification (EasyUQ): Generating Predictive Distributions from Single-Valued Model Output
From MaRDI portal
Publication:6154539
Abstract: How can we quantify uncertainty if our favorite computational tool, be it a numerical, statistical, or machine learning approach, or any other computer model, provides single-valued output only? In this article, we introduce the Easy Uncertainty Quantification (EasyUQ) technique, which transforms real-valued model output into calibrated statistical distributions, based solely on training data of model output-outcome pairs, without any need to access model input. In its basic form, EasyUQ is a special case of the recently introduced Isotonic Distributional Regression (IDR) technique, which leverages the pool-adjacent-violators algorithm for nonparametric isotonic regression. EasyUQ yields discrete predictive distributions that are calibrated and optimal in finite samples, subject to stochastic monotonicity. The workflow is fully automated, without any need for tuning. The Smooth EasyUQ approach supplements IDR with kernel smoothing to yield continuous predictive distributions that preserve key properties of the basic form, including stochastic monotonicity with respect to the original model output and asymptotic consistency. For the selection of kernel parameters, we introduce multiple one-fit grid search, a computationally much less demanding approximation to leave-one-out cross-validation. We use simulation examples and the WeatherBench challenge in data-driven weather prediction to illustrate the techniques. In a study of benchmark problems from machine learning, we show how EasyUQ and Smooth EasyUQ can be integrated into the workflow of modern neural network learning and hyperparameter tuning, and find EasyUQ to be competitive with more elaborate input-based approaches.
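As a rough illustration of the machinery the abstract refers to (a minimal sketch, not the authors' implementation), the pool-adjacent-violators algorithm fits a nondecreasing least-squares sequence by repeatedly merging adjacent blocks that violate monotonicity; IDR applies this step threshold-wise to binary indicators of the outcomes. The function name `pava` below is a hypothetical helper, not an API from the paper's software.

```python
import numpy as np

def pava(y, w=None):
    """Pool-adjacent-violators: weighted least-squares nondecreasing fit to y.

    Hypothetical helper illustrating the algorithm named in the abstract;
    IDR/EasyUQ apply this kind of step to indicators 1{y_i <= z} over
    thresholds z (with the appropriate monotonicity direction).
    """
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    vals, wts, cnts = [], [], []  # block means, block weights, block sizes
    for yi, wi in zip(y, w):
        vals.append(yi); wts.append(wi); cnts.append(1)
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            v2, w2, c2 = vals.pop(), wts.pop(), cnts.pop()
            v1, w1, c1 = vals.pop(), wts.pop(), cnts.pop()
            wt = w1 + w2
            vals.append((w1 * v1 + w2 * v2) / wt)
            wts.append(wt); cnts.append(c1 + c2)
    # Expand block means back to one fitted value per observation.
    return np.repeat(vals, cnts)

print(pava([1.0, 3.0, 2.0, 4.0]))  # the violating pair (3, 2) pools to 2.5
```

Running the example pools the violating pair into a common block mean, yielding the nondecreasing fit `[1.0, 2.5, 2.5, 4.0]`.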
Cites work
- scientific article; zbMATH DE number 2168212
- scientific article; zbMATH DE number 4001209
- scientific article; zbMATH DE number 3388498
- A comprehensive framework for verification, validation, and uncertainty quantification in scientific computing
- A review and comparison of bandwidth selection methods for kernel regression
- ACCRUE: accurate and reliable uncertainty estimate in deterministic models
- Accelerating the pool-adjacent-violators algorithm for isotonic distributional regression
- Analyzing stochastic computer models: a review with opportunities
- Bandwidth selection for the smoothing of distribution functions
- Calibrated Probabilistic Mesoscale Weather Field Forecasting
- Computing electricity spot price prediction intervals using quantile regression and forecast averaging
- Distributional (Single) Index Models
- Distributional regression forests for probabilistic precipitation forecasting in complex terrain
- Ensemble forecasting
- Handbook of uncertainty quantification. In 2 volumes
- Inferences Under a Stochastic Ordering Constraint
- Introduction to uncertainty quantification
- Isotonic Distributional Regression
- Monotone least squares and isotonic quantiles
- Multivariate Student-t regression models: Pitfalls and inference
- Nonparametric shape-restricted regression
- Probabilistic Forecasts, Calibration and Sharpness
- Receiver operating characteristic (ROC) curves: equivalences, beta model, and minimum distance estimation
- Scoring Rules for Continuous Probability Distributions
- Stochastic orders
- Strictly Proper Scoring Rules, Prediction, and Estimation
- Uncertainty quantification in complex simulation models using ensemble copula coupling
- Uncertainty quantification. Theory, implementation, and applications