Refined Rademacher chaos complexity bounds with applications to the multikernel learning problem
From MaRDI portal
Publication: Q5378343
Cites work
- scientific article; zbMATH DE number 5957287 (no title available)
- scientific article; zbMATH DE number 1254560 (no title available)
- scientific article; zbMATH DE number 1950575 (no title available)
- doi:10.1162/153244302760200713
- doi:10.1162/153244303321897690
- Advanced Lectures on Machine Learning
- An introduction to support vector machines and other kernel-based learning methods.
- Convexity, Classification, and Risk Bounds
- Error Analysis of Coefficient-Based Regularized Algorithm for Density-Level Detection
- Fast rates for support vector machines using Gaussian kernels
- Learnability of Gaussians with flexible variances
- Learning Bounds for Support Vector Machines with Learned Kernels
- Learning Theory
- Learning rates of multi-kernel regularized regression
- Learning the kernel function via regularization
- Learning the kernel matrix with semidefinite programming
- Multi-kernel regularized classifiers
- Neural Network Learning
- New approaches to statistical learning theory
- On ranking and generalization bounds
- On the mathematical foundations of learning
- Rademacher Chaos Complexities for Learning the Kernel Problem
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Support vector machine soft margin classifiers: error analysis
- The covering number in learning theory
- Uniform Central Limit Theorems
- Weak convergence and empirical processes. With applications to statistics
Cited in (2)