On the Improved Rates of Convergence for Matérn-Type Kernel Ridge Regression with Application to Calibration of Computer Models
DOI: 10.1137/19M1304222
zbMath: 1459.62143
arXiv: 2001.00152
MaRDI QID: Q5149775
Yan Wang, Rui Tuo, C. F. Jeff Wu
Publication date: 8 February 2021
Published in: SIAM/ASA Journal on Uncertainty Quantification
Full work available at URL: https://arxiv.org/abs/2001.00152
Mathematics Subject Classification:
- Computational methods for problems pertaining to statistics (62-08)
- Inference from spatial processes (62M30)
- Random fields; image analysis (62M40)
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Cites Work
- Scaled Gaussian Stochastic Process for Computer Model Calibration and Prediction
- Approximation of eigenfunctions in kernel-based spaces
- Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
- Statistics for high-dimensional data. Methods, theory and applications.
- Optimal rates for regularization of statistical inverse learning problems
- Efficient calibration for imperfect computer models
- Convergence rates for multivariate smoothing spline functions
- Interpolation of spatial data. Some theory for kriging
- The design and analysis of computer experiments.
- Optimal global rates of convergence for nonparametric regression
- Weak convergence and empirical processes. With applications to statistics
- Bayesian Calibration of Computer Models
- Improved error bounds for scattered data interpolation by radial basis functions
- Prediction based on the Kennedy-O’Hagan calibration model: asymptotic consistency and other properties
- An Introduction to Statistical Learning
- Adjustments to Computer Models via Projected Kernel Calibration
- Learning theory of distributed spectral algorithms
- The reproducing kernel Hilbert space structure of the sample paths of a Gaussian process
- A Theoretical Framework for Calibration in Computer Models: Parametrization, Estimation and Convergence Properties
- Scattered Data Approximation
- Smoothing spline ANOVA models