Unified approach to coefficient-based regularized regression
Publication: 651513
DOI: 10.1016/j.camwa.2011.05.034  zbMath: 1228.62044  MaRDI QID: Q651513
Publication date: 18 December 2011
Published in: Computers & Mathematics with Applications
Full work available at URL: https://doi.org/10.1016/j.camwa.2011.05.034
learning rates; data-dependent hypothesis spaces; \(\ell^2\)-empirical covering number; \(\ell^q\)-regularizer
Related Items
- Error Analysis of Coefficient-Based Regularized Algorithm for Density-Level Detection
- Sharp learning rates of coefficient-based \(\ell^q\)-regularized regression with indefinite kernels
- On the convergence rate of kernel-based sequential greedy regression
- Convergence Analysis of Coefficient-Based Regularization under Moment Incremental Condition
Cites Work
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Least square regression with indefinite kernels and coefficient regularization
- Multi-kernel regularized classifiers
- A note on different covering numbers in learning theory
- The covering number in learning theory
- Compactly supported positive definite radial functions
- Concentration estimates for learning with unbounded sampling
- Learning with sample dependent hypothesis spaces
- Learning rates of least-square regularized regression
- Shannon sampling. II: Connections to learning theory
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- For most large underdetermined systems of linear equations the minimal \(\ell^1\)-norm solution is also the sparsest solution
- Theory of Reproducing Kernels