Learning and approximation by Gaussians on Riemannian manifolds
Recommendations
- SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS
- Learnability of Gaussians with flexible variances
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- Nonparametric Regression between General Riemannian Manifolds
- Approximating and learning by Lipschitz kernel on the sphere
Cites work
- scientific article; zbMATH DE number 3960432 (no title available)
- scientific article; zbMATH DE number 4090083 (no title available)
- scientific article; zbMATH DE number 52737 (no title available)
- Capacity of reproducing kernel spaces in learning theory
- Consistency of spectral clustering
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- Error bounds for learning the kernel
- Fully online classification by regularization
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Learnability of Gaussians with flexible variances
- Learning Theory
- Learning Theory
- Learning theory estimates via integral operators and their approximations
- Model selection for regularized least-squares algorithm in learning theory
- Multi-kernel regularized classifiers
- Regularization networks and support vector machines
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Semi-supervised learning on Riemannian manifolds
- Support vector machine soft margin classifiers: error analysis
- The covering number in learning theory
- Theory of Reproducing Kernels
Cited in (29)
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Bayesian manifold regression
- Learning rates for classification with Gaussian kernels
- Minimax-optimal nonparametric regression in high dimensions
- Semi-supervised learning on Riemannian manifolds
- Learning sparse gradients for variable selection and dimension reduction
- scientific article; zbMATH DE number 6999890 (no title available)
- SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS
- Deep neural networks for rotation-invariance approximation and learning
- High order Parzen windows and randomized sampling
- Stochastic Gradient Descent on Riemannian Manifolds
- Rademacher Chaos Complexities for Learning the Kernel Problem
- Approximating and learning by Lipschitz kernel on the sphere
- A universal envelope for Gaussian processes and their kernels
- Intrinsic dimension adaptive partitioning for kernel methods
- Semi-supervised learning based on high density region estimation
- Solving PDEs on spheres with physics-informed convolutional neural networks
- Parzen windows for multi-class classification
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- Optimal regression rates for SVMs using Gaussian kernels
- Learning rates of regularized regression on the unit sphere
- A deep network construction that adapts to intrinsic dimensionality beyond the domain
- Multiscale regression on unknown manifolds
- Gaussian Distributions on Riemannian Symmetric Spaces: Statistical Learning With Structured Covariance Matrices
- Learning gradients on manifolds
- Learning Rates of lq Coefficient Regularization Learning with Gaussian Kernel
- Geometry on probability spaces
- Learning rates of multi-kernel regularized regression
- Sampling and Stability
This page was built for publication: Learning and approximation by Gaussians on Riemannian manifolds (MaRDI item Q960002)