Optimality of regularized least squares ranking with imperfect kernels
Publication: 6125450
DOI: 10.1016/j.ins.2021.12.087
OpenAlex: W4206483643
MaRDI QID: Q6125450
Yu Zeng, Qiang Wu, Lie Zheng, Fangchao He
Publication date: 11 April 2024
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/j.ins.2021.12.087
Computer science (68-XX); Game theory, economics, finance, and other social and behavioral sciences (91-XX)
Cites Work
- Learning rate of support vector machine for ranking
- Generalization performance of bipartite ranking algorithms with convex losses
- Optimal rates for regularization of statistical inverse learning problems
- The convergence rate of a regularized ranking algorithm
- Debiased magnitude-preserving ranking: learning rate and bias characterization
- On regularization algorithms in learning theory
- Regularization networks with indefinite kernels
- Optimal rates for the regularized least-squares algorithm
- Application of integral operator for regularized least-square regression
- Shannon sampling. II: Connections to learning theory
- Learning theory estimates via integral operators and their approximations
- Error bounds for learning the kernel
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Bias corrected regularization kernel method in ranking
- DOI: 10.1162/153244302760200704
- Leave-One-Out Bounds for Kernel Methods
- Learning rates for regularized least squares ranking algorithm
- Theory of Reproducing Kernels
- On the convergence rate and some applications of regularized ranking algorithms