Distribution-free uncertainty quantification for kernel methods by gradient perturbations
Publication:2320596
Abstract: We propose a data-driven approach to quantify the uncertainty of models constructed by kernel methods. Our approach minimizes the needed distributional assumptions: instead of working with, for example, Gaussian processes or exponential families, it only requires some mild regularity of the measurement noise, such as symmetry or exchangeability. We show, building on recent results from finite-sample system identification, that by perturbing the residuals in the gradient of the objective function, information can be extracted about the amount of uncertainty in our model. In particular, we provide an algorithm to build exact, non-asymptotically guaranteed, distribution-free confidence regions for ideal, noise-free representations of the function we try to estimate. For typical convex quadratic problems and symmetric noises, the regions are star convex, centered around a given nominal estimate, and admit efficient ellipsoidal outer approximations. Finally, we illustrate the ideas on typical kernel methods, such as LS-SVC, KRR, ε-SVR, and kernelized LASSO.
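The gradient-perturbation idea can be made concrete for kernel ridge regression (KRR). Below is a minimal, illustrative Python sketch: a candidate coefficient vector is accepted if the gradient of the KRR objective at that candidate is not extreme compared with copies computed from sign-perturbed residuals, which is justified when the noise is symmetric. The function names, the unweighted squared-norm statistic, and the omission of random tie-breaking are simplifying assumptions of this sketch, not the paper's exact construction.

```python
# Illustrative sketch (not the authors' reference implementation) of a
# sign-perturbed-sums style membership test for kernel ridge regression
# (KRR) coefficients, assuming symmetric measurement noise.
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def in_confidence_region(alpha, K, y, lam, m=100, q=5, seed=0):
    """Return True if `alpha` lies in the ~(1 - q/m) confidence region.

    The reference statistic Z_0 is the squared norm of (half) the KRR
    gradient at `alpha`; it is compared with m-1 copies in which the
    residual signs are randomly flipped. `alpha` is accepted unless Z_0
    ranks among the q largest of the m statistics.
    """
    rng = np.random.default_rng(seed)
    r = y - K @ alpha                       # residuals at the candidate
    def grad_norm(res):
        g = -K @ res + lam * (K @ alpha)    # 1/2 * gradient of the objective
        return g @ g
    z = [grad_norm(r)]                      # reference (unperturbed) statistic
    for _ in range(m - 1):
        signs = rng.choice([-1.0, 1.0], size=len(y))
        z.append(grad_norm(signs * r))      # sign-perturbed statistics
    rank = sum(zi < z[0] for zi in z)       # random tie-breaking omitted
    return rank < m - q

# Tiny usage example with synthetic symmetric (Laplacian) noise.
rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(30, 1))
y = np.sinc(X[:, 0]) + rng.laplace(scale=0.1, size=30)
K, lam = gaussian_kernel(X, X), 0.1
alpha_hat = np.linalg.solve(K + lam * np.eye(30), y)   # nominal KRR estimate
print(in_confidence_region(alpha_hat, K, y, lam))      # nominal estimate: True
```

By construction the nominal estimate itself is always accepted, since the gradient vanishes there; scanning candidates around it traces out the star-convex region the abstract describes.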
Recommendations
- Probabilistic kernel support vector machines
- Efficient methods for robust classification under uncertainty in kernel matrices
- Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models
- Maximum likelihood estimation and uncertainty quantification for Gaussian process approximation of deterministic functions
- Can we trust Bayesian uncertainty quantification from Gaussian process priors with squared exponential covariance kernel?
Cites work
- scientific article; zbMATH DE number 1804115
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 708500
- scientific article; zbMATH DE number 2168212
- An introduction to support vector machines and other kernel-based learning methods
- Gaussian processes for machine learning
- Guaranteed non-asymptotic confidence regions in system identification
- Honest confidence regions for nonparametric regression
- Kernel methods in machine learning
- Kernel methods in system identification, machine learning and function estimation: a survey
- Mathematical foundations of infinite-dimensional statistical models
- Nonparametric regression, confidence regions and regularization
- Permutation, parametric and bootstrap tests of hypotheses
- Sign-Perturbed Sums: A New System Identification Approach for Constructing Exact Non-Asymptotic Confidence Regions in Linear Regression Models
- Some results on Tchebycheffian spline functions and stochastic processes
- Support Vector Machines
- Theory of Reproducing Kernels
Cited in (6)
- Bayesian frequentist bounds for machine learning and system identification
- Facing undermodelling in sign-perturbed-sums system identification
- A simple condition for the boundedness of sign-perturbed-sums (SPS) confidence regions
- Easy Uncertainty Quantification (EasyUQ): Generating Predictive Distributions from Single-Valued Model Output
- Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models
- Scalable bounding of predictive uncertainty in regression problems with SLAC