Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
Publication: Q448851
DOI: 10.1155/2012/915920 · zbMath: 1280.68177 · OpenAlex: W1985786749 · Wikidata: Q58697082 · Scholia: Q58697082 · MaRDI QID: Q448851
Publication date: 7 September 2012
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2012/915920
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
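For orientation, a standard formulation of \(l^p\)-norm multiple kernel learning with least square loss reads as follows (a sketch in common notation; the sample size \(n\), regularization parameter \(\lambda\), and constraint set are assumptions here, and the paper's exact setup may differ). Given candidate kernels \(K_1,\dots,K_M\) and samples \((x_i,y_i)_{i=1}^n\), one learns kernel weights \(\theta\) and a predictor \(f\) jointly:
\[
\min_{\theta \ge 0,\ \|\theta\|_p \le 1}\ \min_{f \in \mathcal{H}_{K_\theta}}\ \frac{1}{n}\sum_{i=1}^{n}\bigl(f(x_i)-y_i\bigr)^2 + \lambda \|f\|_{K_\theta}^2, \qquad K_\theta = \sum_{m=1}^{M}\theta_m K_m .
\]
Error bounds for such schemes typically quantify how fast the learned \(f\) approaches the regression function as \(n\) grows, with the \(l^p\)-constraint on \(\theta\) controlling the effective capacity of the combined kernel class.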
Related Items (3)
- An efficient kernel learning algorithm for semisupervised regression problems
- On the convergence rate of kernel-based sequential greedy regression
- The learning rates of regularized regression based on reproducing kernel Banach spaces
Cites Work
- Consistency analysis of spectral regularization algorithms
- Sparsity in penalized empirical risk minimization
- Multi-kernel regularized classifiers
- A note on application of integral operator in learning theory
- Weak convergence and empirical processes. With applications to statistics
- Learning rates of least-square regularized regression
- Shannon sampling. II: Connections to learning theory
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Error bounds for learning the kernel
- Statistical Learning Theory: Models, Concepts, and Results
- Capacity of reproducing kernel spaces in learning theory
- Graph-Based Semi-Supervised Learning and Spectral Kernel Design
- doi:10.1162/153244302760200704
- Model Selection and Estimation in Regression with Grouped Variables
- Theory of Reproducing Kernels
- The elements of statistical learning. Data mining, inference, and prediction