On kernel methods for covariates that are rankings (Q1657968)

From MaRDI portal
scientific article

    Statements

    On kernel methods for covariates that are rankings (English)
    14 August 2018
    Support vector machines (SVMs) are among the most powerful supervised learning models; they rest on the Vapnik-Chervonenkis theory of statistical learning developed in the late 1960s. More recently, SVMs have also been applied to ranking data. The reproducing kernel Hilbert space (RKHS) lies at the heart of SVM theory and enables the efficient construction of a ``rectifying'' space, in the terminology first proposed by \textit{M. A. Aĭzerman} et al. [Autom. Remote Control 25, 821--837 (1965; Zbl 0151.24701); translation from Avtom. Telemekh. 25, 917--936 (1964)]. After such a mapping, simple linear methods such as PCA or ridge regression can be applied in the RKHS. This paper focuses on the Kendall and Mallows kernels, which are right-invariant: they are unchanged under a re-indexing of the underlying objects. The authors analyze the feature maps and spectral properties of these kernels and construct new kernels by interpolating between them. Experiments on the Eurobarometer and Movielens ratings datasets demonstrate the effectiveness of the proposed ranking methods, using SVMs with both the Kendall and the Mallows kernels and conventional cross-validation for parameter selection.
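    The two kernels under discussion admit a compact sketch. Assuming each ranking is encoded as a vector of item scores (a higher score meaning a more preferred item), the Kendall kernel is the normalized difference between concordant and discordant item pairs, and the Mallows kernel is the exponential of the negative Kendall tau distance. The function names and the bandwidth parameter `lam` below are illustrative, not taken from the paper:

    ```python
    from itertools import combinations
    from math import comb, exp

    def kendall_kernel(x, y):
        """Kendall kernel between two rankings x, y (score vectors over the
        same items): (concordant pairs - discordant pairs) / (n choose 2).
        Ranges in [-1, 1]; a positive-definite kernel on permutations."""
        n = len(x)
        nc = nd = 0
        for i, j in combinations(range(n), 2):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                nc += 1  # pair ordered the same way in both rankings
            elif s < 0:
                nd += 1  # pair ordered oppositely
        return (nc - nd) / comb(n, 2)

    def mallows_kernel(x, y, lam=1.0):
        """Mallows kernel exp(-lam * n_d), where n_d is the number of
        discordant pairs (the Kendall tau distance between the rankings)."""
        n = len(x)
        nd = sum(1 for i, j in combinations(range(n), 2)
                 if (x[i] - x[j]) * (y[i] - y[j]) < 0)
        return exp(-lam * nd)
    ```

    Both functions depend only on the relative order of the scores, so relabeling the underlying items consistently in both arguments leaves the values unchanged, which is the right-invariance the review refers to. A precomputed Gram matrix of either kernel can be fed to a standard SVM implementation.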
    Mallows kernel
    Kendall kernel
    polynomial kernel
    representation theory
    Fourier analysis
    symmetric group
    ridge regression
