Application of integral operator for regularized least-square regression

From MaRDI portal

Publication:2389897


DOI: 10.1016/j.mcm.2008.08.005
zbMath: 1165.45310
MaRDI QID: Q2389897

Qiang Wu, Hongwei Sun

Publication date: 20 July 2009

Published in: Mathematical and Computer Modelling

Full work available at URL: https://doi.org/10.1016/j.mcm.2008.08.005


45P05: Integral operators
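The record concerns regularized least-square regression analyzed through an integral (kernel) operator. As context only, here is a minimal sketch of the standard kernel regularized least-squares estimator, whose coefficients solve \((K + \lambda n I)c = y\); the Gaussian kernel and the parameter values `lam` and `sigma` are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel: K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rls_fit(X, y, lam=1e-2, sigma=1.0):
    # Regularized least squares in the RKHS: minimize
    #   (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    # The minimizer is f(x) = sum_i c_i K(x, x_i) with
    #   c = (K + lam * n * I)^{-1} y.
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def rls_predict(X_train, c, X_new, sigma=1.0):
    # Evaluate f(x) = sum_i c_i K(x, x_i) at the new points.
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Usage sketch: recover a noisy sine from 50 samples.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, (50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
c = rls_fit(X, y)
pred = rls_predict(X, c, X)
```

The matrix \(K/n\) here is the empirical counterpart of the integral operator \(L_K\) associated with the kernel, which is the object whose spectral properties the paper's error analysis exploits.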


Related Items

Bias corrected regularization kernel method in ranking
Reproducing Properties of Differentiable Mercer-Like Kernels on the Sphere
REGULARIZED LEAST SQUARE ALGORITHM WITH TWO KERNELS
On the K-functional in learning theory
Learning performance of uncentered kernel-based principal component analysis
Approximation of eigenfunctions in kernel-based spaces
Reproducing kernel Hilbert spaces associated with kernels on topological spaces
Regularized least square regression with unbounded and dependent sampling
Consistency analysis of spectral regularization algorithms
Hierarchical least squares algorithms for single-input multiple-output systems based on the auxiliary model
Least square regression with indefinite kernels and coefficient regularization
The convergence rate of a regularized ranking algorithm
Debiased magnitude-preserving ranking: learning rate and bias characterization
An extension of Mercer's theory to \(L^p\)
Learning rates for the kernel regularized regression with a differentiable strongly convex loss
Learning performance of regularized regression with multiscale kernels based on Markov observations
Half supervised coefficient regularization for regression learning with unbounded sampling
Coefficient regularized regression with non-iid sampling
Reproducing properties of differentiable Mercer-like kernels
LEAST SQUARE REGRESSION WITH COEFFICIENT REGULARIZATION BY GRADIENT DESCENT
Regression learning with non-identically and non-independently sampling


