Error analysis of kernel regularized pairwise learning with a strongly convex loss
Publication: 6112862
DOI: 10.3934/mfc.2022030
zbMath: 1527.68198
OpenAlex: W4294081359
MaRDI QID: Q6112862
Publication date: 7 August 2023
Published in: Mathematical Foundations of Computing
Full work available at URL: https://doi.org/10.3934/mfc.2022030
Cites Work
- Generalization bounds for metric and similarity learning
- Consistency analysis of an empirical minimum error entropy algorithm
- On the robustness of regularized pairwise learning methods based on kernels
- The learning rate of \(l_2\)-coefficient regularized classification with strong loss
- The convergence rate of a regularized ranking algorithm
- Multi-kernel regularized classifiers
- Behavior of a functional in learning theory
- Learning rates for least square regressions with coefficient regularization
- Theory of deep convolutional neural networks: downsampling
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Convergence of online pairwise regression learning with quadratic loss
- Kernel gradient descent algorithm for information theoretic learning
- Learning sets with separating kernels
- Universality of deep convolutional neural networks
- Regularized ranking with convex losses and \(\ell^1\)-penalty
- Online regularized learning with pairwise loss functions
- Ranking and empirical minimization of \(U\)-statistics
- Error analysis on regularized regression based on the maximum correntropy criterion
- Learning Theory
- Estimating the approximation error in learning theory
- Deep distributed convolutional neural networks: Universality
- Semi-supervised learning for regression based on the diffusion matrix
- Coefficient-based regularization network with variance loss for error
- Robust kernel-based distribution regression
- Regularized least square regression with spherical polynomial kernels
- Online regularized pairwise learning with least squares loss
- Convergence analysis of distributed multi-penalty regularized pairwise learning
- Regularization schemes for minimum error entropy principle
- Probability Inequalities for Sums of Bounded Random Variables
- Learning rates for regularized least squares ranking algorithm
- Online Pairwise Learning Algorithms
- Online minimum error entropy algorithm with unbounded sampling
- Convex analysis and monotone operator theory in Hilbert spaces
- On the convergence rate and some applications of regularized ranking algorithms
- Theory of deep convolutional neural networks. III: Approximating radial functions