A Reproducing Kernel Hilbert Space Framework for Functional Classification

From MaRDI portal
Publication: 6180729

DOI: 10.1080/10618600.2022.2138407 · arXiv: 2103.04504 · MaRDI QID: Q6180729 · FDO: Q6180729


Authors: Peijun Sang, Adam B. Kashlak, Linglong Kong


Publication date: 22 January 2024

Published in: Journal of Computational and Graphical Statistics

Abstract: We encounter a bottleneck when we try to borrow the strength of classical classifiers to classify functional data. The major issue is that functional data are intrinsically infinite dimensional, so classical classifiers cannot be applied directly or perform poorly due to the curse of dimensionality. To address this concern, we propose to project functional data onto one specific direction, and then a distance-weighted discrimination (DWD) classifier is built upon the projection score. The projection direction is identified by minimizing, over a reproducing kernel Hilbert space, an empirical risk function that contains the particular loss function of a DWD classifier. Hence the proposed classifier avoids overfitting and enjoys the appealing properties of DWD classifiers. This framework is further extended to accommodate functional data classification problems in which scalar covariates are involved. In contrast to previous work, we establish a non-asymptotic estimation error bound on the relative misclassification rate. In the finite-sample case, we demonstrate through simulation studies and a real-world application that the proposed classifiers compare favorably with several commonly used functional classifiers in terms of prediction accuracy.
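The following is a minimal illustrative sketch, not the authors' implementation, of the idea summarized in the abstract: discretize each curve on a grid, parametrize the projection direction beta(t) by a kernel expansion over the same grid, and choose its coefficients by minimizing a regularized empirical risk built on the DWD loss as it is typically defined in the DWD literature. All names (gaussian_kernel, dwd_loss, fit_rkhs_dwd), the Gaussian kernel choice, and the tuning values (lam, bandwidth) are assumptions made for this sketch.

```python
# Hedged sketch of an RKHS-based DWD-type functional classifier.
# NOT the paper's code; an illustration under the assumptions stated above.
import numpy as np
from scipy.optimize import minimize

def gaussian_kernel(s, t, bandwidth=0.1):
    """Gaussian reproducing kernel evaluated on two grids of time points."""
    return np.exp(-(s[:, None] - t[None, :]) ** 2 / (2 * bandwidth ** 2))

def dwd_loss(u):
    """DWD loss as commonly defined: 1 - u for u <= 1/2, 1/(4u) otherwise."""
    return np.where(u <= 0.5, 1.0 - u, 1.0 / (4.0 * np.maximum(u, 0.5)))

def fit_rkhs_dwd(X, y, grid, lam=1e-2, bandwidth=0.1):
    """
    X: (n, m) curves evaluated on `grid` (length m); y: labels in {-1, +1}.
    beta(t) = sum_j c_j K(grid_j, t), so the projection score
    <X_i, beta> is approximated by dt * X_i @ K @ c.
    """
    n, m = X.shape
    dt = grid[1] - grid[0]
    K = gaussian_kernel(grid, grid, bandwidth)

    def objective(params):
        c, b = params[:m], params[m]
        scores = dt * X @ K @ c            # projection scores <X_i, beta>
        margins = y * (scores + b)
        risk = dwd_loss(margins).mean()    # empirical DWD risk
        penalty = lam * c @ K @ c          # RKHS norm penalty ||beta||_K^2
        return risk + penalty

    res = minimize(objective, np.zeros(m + 1), method="L-BFGS-B")
    return res.x[:m], res.x[m], K, dt

def predict(X_new, c, b, K, dt):
    """Classify by the sign of the projection score plus intercept."""
    return np.sign(dt * X_new @ K @ c + b)

if __name__ == "__main__":
    # Toy example: two classes of noisy curves with different mean functions.
    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 1.0, 50)
    n = 100
    y = rng.choice([-1, 1], size=n)
    X = np.where(y[:, None] > 0, np.sin(2 * np.pi * grid), np.cos(2 * np.pi * grid))
    X = X + 0.3 * rng.standard_normal((n, grid.size))
    c, b, K, dt = fit_rkhs_dwd(X, y, grid)
    print("training accuracy:", (predict(X, c, b, K, dt) == y).mean())
```

The sketch handles only functional predictors; the paper's extension with scalar covariates would add those covariates to the linear score before applying the DWD loss, which is omitted here.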


Full work available at URL: https://arxiv.org/abs/2103.04504













This page was built for publication: A Reproducing Kernel Hilbert Space Framework for Functional Classification
