Constructive analysis for coefficient regularization regression algorithms
From MaRDI portal
DOI: 10.1016/j.jmaa.2015.06.006 · zbMATH Open: 1326.62148 · OpenAlex: W573944826 · MaRDI QID: Q491841 · FDO: Q491841
Authors: Weilin Nie, Cheng Wang
Publication date: 19 August 2015
Published in: Journal of Mathematical Analysis and Applications
Full work available at URL: https://doi.org/10.1016/j.jmaa.2015.06.006
Recommendations
- Constructive analysis for least squares regression with generalized \(K\)-norm regularization
- Learning rates for least square regressions with coefficient regularization
- On the optimized parameters of coefficient regularized regressions
- Least square regression with \(l^{p}\)-coefficient regularization
- Convergence analysis of coefficient-based regularization under moment incremental condition
Keywords: least squares regression; coefficient regularization; constructive stepping-stone function; error decomposition
Cites Work
- On the mathematical foundations of learning
- Title not available
- Support vector machine soft margin classifiers: error analysis
- The covering number in learning theory
- Learning theory estimates for coefficient-based regularized regression
- Concentration estimates for learning with unbounded sampling
- Learning with sample dependent hypothesis spaces
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Convergence analysis of coefficient-based regularization under moment incremental condition
- Capacity of reproducing kernel spaces in learning theory
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Least square regression with indefinite kernels and coefficient regularization
- Unified approach to coefficient-based regularized regression
- Multi-kernel regularized classifiers
- Regularization schemes for minimum error entropy principle
- Indefinite kernel network with dependent sampling
- ONLINE LEARNING WITH MARKOV SAMPLING
- Sharp learning rates of coefficient-based \(l^q\)-regularized regression with indefinite kernels
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- Constructive analysis for least squares regression with generalized \(K\)-norm regularization
Cited In (9)
- Constructive analysis for least squares regression with generalized \(K\)-norm regularization
- Error analysis for \(l^q\)-coefficient regularized moving least-square regression
- A simpler approach to coefficient regularized support vector machines regression
- Least square regression with \(l^{p}\)-coefficient regularization
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Least square regression with coefficient regularization by gradient descent
- On the optimized parameters of coefficient regularized regressions
- Learning theory estimates for coefficient-based regularized regression
- Convergence analysis of coefficient-based regularization under moment incremental condition