Optimal learning rates of \(l^p\)-type multiple kernel learning under general conditions
From MaRDI portal
Publication: Q526680
DOI: 10.1016/j.ins.2014.09.011 · zbMath: 1360.68691 · MaRDI QID: Q526680
Publication date: 15 May 2017
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/j.ins.2014.09.011
Keywords: generalization ability; correlation measure; multiple kernel learning; local Rademacher complexity; kernel learning
MSC: 68T05 Learning and adaptive systems in artificial intelligence
Cites Work
- A fast algorithm for manifold learning by posing it as a symmetric diagonally dominant linear system
- Sparsity in multiple kernel learning
- Evolutionary combination of kernels for nonlinear feature transformation
- Sparsity in penalized empirical risk minimization
- Multi-kernel regularized classifiers
- Convex multi-task feature learning
- Optimal rates for the regularized least-squares algorithm
- Statistical performance of support vector machines
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Learning rates of least-square regularized regression
- Local Rademacher complexities
- On the mathematical foundations of learning
- Support Vector Machines
- A Reproducing Kernel Hilbert Space Framework for Spike Train Signal Processing
- Advanced Lectures on Machine Learning
- Choosing multiple parameters for support vector machines