Learnability of Gaussians with flexible variances
From MaRDI portal
Publication:3174075
Cited in (37):
- Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies
- Classification with Gaussians and convex loss. II: Improving error bounds by noise conditions
- Online classification with varying Gaussians
- Learning rates for classification with Gaussian kernels
- scientific article (no title available); zbMATH DE number 7306897
- Nonlinear approximation using Gaussian kernels
- Convergence analysis of online algorithms
- Some properties of Gaussian reproducing kernel Hilbert spaces and their implications for function approximation and learning theory
- Learning and approximation by Gaussians on Riemannian manifolds
- Error analysis on regularized regression based on the maximum correntropy criterion
- Learnability with respect to fixed distributions
- On extension theorems and their connection to universal consistency in machine learning
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Multi-kernel regularized classifiers
- Distributed regularized least squares with flexible Gaussian kernels
- Orthogonality from disjoint support in reproducing kernel Hilbert spaces
- Unregularized online algorithms with varying Gaussians
- The optimal solution of multi-kernel regularization learning
- Convergence of online pairwise regression learning with quadratic loss
- Approximation of kernel matrices by circulant matrices and its application in kernel selection methods
- High order Parzen windows and randomized sampling
- Summation of Gaussian shifts as Jacobi's third theta function
- Learning performance of regularized regression with multiscale kernels based on Markov observations
- Conditional quantiles with varying Gaussians
- A note on support vector machines with polynomial kernels
- Learning with sample dependent hypothesis spaces
- Parzen windows for multi-class classification
- Refined Rademacher chaos complexity bounds with applications to the multikernel learning problem
- Optimal regression rates for SVMs using Gaussian kernels
- Quantitative convergence analysis of kernel based large-margin unified machines
- Learning the coordinate gradients
- Error Estimates for Multivariate Regression on Discretized Function Spaces
- Maximum correntropy criterion regression models with tending-to-zero scale parameters
- Least square regularized regression for multitask learning
- Optimal learning with Gaussians and correntropy loss
- Learning rates of multi-kernel regularized regression
- Error bounds for learning the kernel