Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
Publication: 543025
DOI: 10.11650/twjm/1500406018
zbMath: 1221.68204
OpenAlex: W4250071731
MaRDI QID: Q543025
Publication date: 21 June 2011
Published in: Taiwanese Journal of Mathematics
Full work available at URL: https://doi.org/10.11650/twjm/1500406018
Keywords: learning theory; error analysis; regularization scheme; data dependent hypothesis spaces; nonsymmetric kernel
Mathematics Subject Classification: General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
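The title and keywords point to a coefficient-based regularization scheme over a data dependent hypothesis space spanned by a possibly nonsymmetric kernel. As a hedged illustration only (the paper's exact formulation, normalization, and constants may differ), such a scheme for a sample \(\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}\) and a kernel \(K\) that need not be symmetric or positive semidefinite typically reads:
\[
f_{\mathbf{z}}(x) = \sum_{i=1}^{m} \alpha_i^{\mathbf{z}} K(x, x_i),
\qquad
\boldsymbol{\alpha}^{\mathbf{z}}
= \operatorname*{arg\,min}_{\alpha \in \mathbb{R}^{m}}
\left\{ \frac{1}{m} \sum_{j=1}^{m} \Bigl( \sum_{i=1}^{m} \alpha_i K(x_j, x_i) - y_j \Bigr)^{2}
+ \lambda \sum_{i=1}^{m} |\alpha_i| \right\}.
\]
Here the hypothesis space \(\mathcal{H}_{\mathbf{z}} = \{\sum_{i=1}^{m} \alpha_i K(\cdot, x_i) : \alpha \in \mathbb{R}^{m}\}\) is built from the sample itself, which is what "data dependent hypothesis spaces" refers to, and the \(\ell^1\) penalty on the coefficients takes the place of the RKHS norm used in classical kernel regularization.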
Related Items (26)
Optimality of the rescaled pure greedy learning algorithms
Distributed learning with partial coefficients regularization
Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
Kernel-based sparse regression with the correntropy-induced loss
Learning with coefficient-based regularization and \(\ell^1\)-penalty
Distributed semi-supervised regression learning with coefficient regularization
Learning rates for least square regressions with coefficient regularization
Least squares regression with \(l_1\)-regularizer in sum space
On reproducing kernel Banach spaces: generic definitions and unified framework of constructions
Least square regression with indefinite kernels and coefficient regularization
Learning theory approach to a system identification problem involving atomic norm
On the convergence rate of kernel-based sequential greedy regression
Learning with Convex Loss and Indefinite Kernels
Convergence rate of the semi-supervised greedy algorithm
Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
Unified approach to coefficient-based regularized regression
A simpler approach to coefficient regularized support vector machines regression
Indefinite kernel network with \(l^q\)-norm regularization
Constructive analysis for least squares regression with generalized \(K\)-norm regularization
Constructive analysis for coefficient regularization regression algorithms
Classification with polynomial kernels and \(l^1\)-coefficient regularization
Support vector machines regression with \(l^1\)-regularizer
On empirical eigenfunction-based ranking with \(\ell^1\) norm regularization
Concentration estimates for learning with \(\ell^{1}\)-regularizer and data dependent hypothesis spaces
Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
Regularized ranking with convex losses and \(\ell^1\)-penalty