Regularization networks with indefinite kernels
From MaRDI portal
Recommendations
- Least square regression with indefinite kernels and coefficient regularization
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Learning rates of least-square regularized regression
- Learning with convex loss and indefinite kernels
Cited in (29)
- Analysis of regularized least squares ranking with centered reproducing kernel
- Kernel-based sparse regression with the correntropy-induced loss
- Coefficient-based regularized regression with indefinite kernels by unbounded sampling
- Optimal rates for coefficient-based regularized regression
- Distributed learning with indefinite kernels
- Mumford-Shah Regularizer with Contextual Feedback
- Nyström subsampling method for coefficient-based regularized regression
- Infinite-\(\sigma \) limits for Tikhonov regularization
- Indefinite kernel network with \(l^q\)-norm regularization
- Regularized least square algorithm with two kernels
- Least square regression with indefinite kernels and coefficient regularization
- Splines with non positive kernels
- Indefinite kernels in least squares support vector machines and principal component analysis
- Indefinite kernel discriminant analysis
- Guaranteed Classification via Regularized Similarity Learning
- A new realization of rational functions, with applications to linear combination interpolation, the Cuntz relations and kernel decompositions
- Regularization networks and support vector machines
- Coefficient-based regression with non-identical unbounded sampling
- Distributed robust regression with correntropy losses and regularization kernel networks
- Bias corrected regularization kernel method in ranking
- On the regularization of convolutional kernel tensors in neural networks
- Indefinite kernel network with dependent sampling
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Regression learning with non-identically and non-independently sampling
- Coefficient-based regularization network with variance loss for error
- Coefficient-based regularized distribution regression
- Optimality of regularized least squares ranking with imperfect kernels
- Learning with convex loss and indefinite kernels
- Iterative kernel regression with preconditioning
This page was built for publication: Regularization networks with indefinite kernels (MaRDI item Q1935751)