Analysis of regularized Nyström subsampling for regression functions of low smoothness
From MaRDI portal
Publication:5236751
Abstract: This paper studies a Nyström-type subsampling approach to large-scale kernel learning methods in the misspecified case, where the target function is not assumed to belong to the reproducing kernel Hilbert space generated by the underlying kernel. Despite its practical importance, this case is less well understood. To model it, the smoothness of target functions is described in terms of general source conditions. Surprisingly, for almost the whole range of source conditions describing the misspecified case, the corresponding learning-rate bounds can be achieved with a single value of the regularization parameter. This observation allows the formulation of mild conditions under which plain Nyström subsampling can be realized at subquadratic cost while maintaining the guaranteed learning rates.
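The scheme discussed in the abstract can be illustrated with a minimal sketch of plain Nyström subsampling combined with Tikhonov (ridge) regularization: pick m landmark points uniformly at random, solve the regularized least-squares problem in the landmark coordinates, and predict with the resulting expansion. This is a generic illustration, not the paper's exact estimator; the Gaussian kernel, the jitter term, and all names are illustrative assumptions. The cost is O(nm²) instead of the O(n³) of the full kernel solve, which is the source of the subquadratic savings.

```python
import numpy as np

def gauss_kernel(A, B, sigma=0.2):
    """Gaussian kernel matrix between row sets A (n x d) and B (m x d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def nystrom_krr(X, y, m, lam, kernel, rng=None):
    """Plain Nyström-subsampled kernel ridge regression (illustrative sketch).

    Draws m landmarks uniformly without replacement and solves the
    Tikhonov-regularized normal equations restricted to the landmark
    subspace:  (K_nm^T K_nm + n*lam*K_mm) alpha = K_nm^T y.
    Returns a prediction function x -> kernel(x, landmarks) @ alpha.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    Xm = X[idx]
    K_nm = kernel(X, Xm)   # n x m cross-kernel matrix
    K_mm = kernel(Xm, Xm)  # m x m landmark kernel matrix
    A = K_nm.T @ K_nm + n * lam * K_mm
    b = K_nm.T @ y
    # Small jitter for numerical stability of the solve.
    alpha = np.linalg.solve(A + 1e-12 * np.eye(m), b)
    return lambda X_new: kernel(X_new, Xm) @ alpha
```

For example, fitting a smooth 1-D target with only 40 of 200 points used as landmarks already recovers the function well, while the regularization parameter `lam` stays fixed, in the spirit of the single-parameter observation above:

```python
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
predict = nystrom_krr(X, y, m=40, lam=1e-6, kernel=gauss_kernel, rng=0)
mse = np.mean((predict(X) - y) ** 2)
```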
Recommendations
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Nyström subsampling method for coefficient-based regularized regression
- Nyström type subsampling analyzed as a regularized projection
- Subgradient and sampling algorithms for \(\ell_1\) regression
- Approximation of least squares regression on nested subspaces
- Manifold regularization based on Nyström type subsampling
- On Estimation Accuracy for Nonsmooth Functionals of Regression
- scientific article; zbMATH DE number 3976099
- Least-square regularized regression with non-iid sampling
Cites work
- scientific article; zbMATH DE number 936298
- A numerical differentiation method and its application to reconstruction of discontinuity
- An introduction to support vector machines and other kernel-based learning methods
- Convergence rates of kernel conjugate gradient for random design regression
- Detection of irregular points by regularization in numerical differentiation and application to edge detection
- Discrepancy based model selection in statistical inverse problems
- Distributed learning with regularized least squares
- Error bounds for Tikhonov regularization in Hilbert scales
- How general are general source conditions?
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Learning sets with separating kernels
- Learning theory estimates via integral operators and their approximations
- Nyström type subsampling analyzed as a regularized projection
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- Regularization theory for ill-posed problems. Selected topics
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Thresholded spectral algorithms for sparse approximations
- Tikhonov regularization with oversmoothing penalty for non-linear ill-posed problems in Hilbert scales
- Unregularized online learning algorithms with general loss functions
Cited in (17)
- The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs
- Regularized Nyström Subsampling in Covariate Shift Domain Adaptation Problems
- Nyström subsampling method for coefficient-based regularized regression
- Efficient kernel canonical correlation analysis using Nyström approximation
- Iterative kernel regression with preconditioning
- Moduli of smoothness, \(K\)-functionals and Jackson-type inequalities associated with Kernel function approximation in learning theory
- A multiscale RBF method for severely ill-posed problems on spheres
- Semi-discrete Tikhonov regularization in RKHS with large randomly distributed noise
- Nyström landmark sampling and regularized Christoffel functions
- Manifold regularization based on Nyström type subsampling
- Diversity sampling is an implicit regularization for kernel methods
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Sketching with Spherical Designs for Noisy Data Fitting on Spheres
- Nyström type subsampling analyzed as a regularized projection
- Pairwise learning problems with regularization networks and Nyström subsampling approach
- Adaptive parameter selection for kernel ridge regression
- Kernel conjugate gradient methods with random projections
This page was built for publication: Analysis of regularized Nyström subsampling for regression functions of low smoothness