Constructive analysis for least squares regression with generalized \(K\)-norm regularization
Publication: 1724159
DOI: 10.1155/2014/458459 · zbMath: 1472.62113 · OpenAlex: W2110830488 · Wikidata: Q59038393 · Scholia: Q59038393 · MaRDI QID: Q1724159
Publication date: 14 February 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2014/458459
MSC classifications: Linear regression; mixed models (62J05); Inference from stochastic processes and spectral analysis (62M15)
Related Items (1)
Cites Work
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Multi-kernel regularized classifiers
- The covering number in learning theory
- Learning theory estimates for coefficient-based regularized regression
- Concentration estimates for learning with unbounded sampling
- Learning with sample dependent hypothesis spaces
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Convergence analysis of coefficient-based regularization under moment incremental condition
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- Probability Inequalities for the Sum of Independent Random Variables
- Capacity of reproducing kernel spaces in learning theory
- Estimating the approximation error in learning theory
- Theory of Reproducing Kernels