SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS
DOI: 10.1142/S0219530509001384 · zbMATH Open: 1175.68346 · OpenAlex: W4233058794 · Wikidata: Q115245537 · MaRDI QID: Q3395345
Authors: Gui-Bo Ye, Ding-Xuan Zhou
Publication date: 26 August 2009
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s0219530509001384
Keywords: manifold learning; reproducing kernel Hilbert spaces; approximation; Gaussian kernels; general loss function; multi-kernel regularized classifier
Classification:
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
- Methods of global Riemannian geometry, including PDE methods; curvature restrictions (53C21)
Cites Work
- Regularization networks and support vector machines
- Learning Theory
- Consistency of spectral clustering
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Sobolev spaces on Riemannian manifolds
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- Semi-supervised learning on Riemannian manifolds
- Learning and approximation by Gaussians on Riemannian manifolds
- The covering number in learning theory
- Learning theory estimates via integral operators and their approximations
- Capacity of reproducing kernel spaces in learning theory
- Multi-kernel regularized classifiers
- Estimating the approximation error in learning theory
- Model selection for regularized least-squares algorithm in learning theory
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Fully online classification by regularization
- Learning theory: from regression to classification
Cited In (12)
- Normal estimation on manifolds by gradient learning
- Concentration estimates for the moving least-square method in learning theory
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Learning and approximation by Gaussians on Riemannian manifolds
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Title not available
- ERM learning algorithm for multi-class classification
- Convergence rate of kernel canonical correlation analysis
- Intrinsic dimension adaptive partitioning for kernel methods
- Solving PDEs on spheres with physics-informed convolutional neural networks
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- Integral operator approach to learning theory with unbounded sampling