Trading Variance Reduction with Unbiasedness: The Regularized Subspace Information Criterion for Robust Model Selection in Kernel Regression
Publication: 4832480
DOI: 10.1162/089976604773135113 · zbMath: 1089.68111 · OpenAlex: W2113530803 · Wikidata: Q42624385 · Scholia: Q42624385 · MaRDI QID: Q4832480
Motoaki Kawanabe, Masashi Sugiyama, Klaus-Robert Müller
Publication date: 4 January 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976604773135113
Related Items
- Reward-Weighted Regression with Sample Reuse for Direct Policy Search in Reinforcement Learning
- Semi-supervised speaker identification under covariate shift
- Adaptive importance sampling for value function approximation in off-policy reinforcement learning
- Semi-supervised learning based on high density region estimation
- Direct importance estimation for covariate shift adaptation
Uses Software
Cites Work
- A comparison of GCV and GML for choosing the smoothing parameter in the generalized spline smoothing problem
- Asymptotic optimality of \(C_L\) and generalized cross-validation in ridge regression with application to spline smoothing
- A note on generalized cross-validation with replicates
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Assessing the error probability of the model selection test
- An application of multiple comparison techniques to model selection
- Subspace Information Criterion for Model Selection
- On the mathematical foundations of learning
- DOI: 10.1162/153244303765208412
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Ideal spatial adaptation by wavelet shrinkage
- Generalised information criteria in model selection
- DOI: 10.1162/153244302760200704
- De-noising by soft-thresholding
- Some Comments on \(C_P\)
- Theory of Reproducing Kernels
- A new look at the statistical model identification