Learning and approximation by Gaussians on Riemannian manifolds
From MaRDI portal
Publication: Q960002
DOI: 10.1007/s10444-007-9049-0
zbMath: 1156.68045
Wikidata: Q115384796 (Scholia: Q115384796)
MaRDI QID: Q960002
Publication date: 16 December 2008
Published in: Advances in Computational Mathematics
Full work available at URL: https://doi.org/10.1007/s10444-007-9049-0
Keywords: learning theory; approximation; Riemannian manifolds; reproducing kernel Hilbert spaces; Gaussian kernels; multi-kernel least square regularization scheme
MSC classification: 68T05 (Learning and adaptive systems in artificial intelligence)
Related Items
- Learning Rates of lq Coefficient Regularization Learning with Gaussian Kernel
- Sampling and Stability
- Bayesian manifold regression
- Learning rates of regularized regression on the unit sphere
- Learning sparse gradients for variable selection and dimension reduction
- Learning gradients on manifolds
- Geometry on probability spaces
- Parzen windows for multi-class classification
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Learning rates of multi-kernel regularized regression
- High order Parzen windows and randomized sampling
- Optimal regression rates for SVMs using Gaussian kernels
- Minimax-optimal nonparametric regression in high dimensions
- A universal envelope for Gaussian processes and their kernels
- Approximating and learning by Lipschitz kernel on the sphere
- Rademacher Chaos Complexities for Learning the Kernel Problem
- SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS
Cites Work
- Semi-supervised learning on Riemannian manifolds
- Model selection for regularized least-squares algorithm in learning theory
- Multi-kernel regularized classifiers
- The covering number in learning theory
- Regularization networks and support vector machines
- Fully online classification by regularization
- Consistency of spectral clustering
- Learning theory estimates via integral operators and their approximations
- Error bounds for learning the kernel
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Learning Theory
- Theory of Reproducing Kernels