A Large Dimensional Analysis of Least Squares Support Vector Machines

From MaRDI portal
Publication:4628262

DOI: 10.1109/TSP.2018.2889954 · zbMATH Open: 1414.62250 · arXiv: 1701.02967 · Wikidata: Q128626412 · MaRDI QID: Q4628262


Authors: Zhenyu Liao, R. Couillet


Publication date: 6 March 2019

Published in: IEEE Transactions on Signal Processing

Abstract: In this article, a large dimensional performance analysis of kernel least squares support vector machines (LS-SVMs) is provided under the assumption of a two-class Gaussian mixture model for the input data. Building upon recent advances in random matrix theory, we show that, when both the data dimension p and the sample size n are large, the LS-SVM decision function can be well approximated by a normally distributed random variable, the mean and variance of which depend explicitly on a local behavior of the kernel function. This theoretical result is then applied to the MNIST and Fashion-MNIST datasets which, despite their non-Gaussianity, exhibit a convincingly close behavior. Most importantly, our analysis provides a deeper understanding of the mechanisms at play in SVM-type methods, in particular of the impact of the choice of the kernel function, as well as some of their theoretical limits in separating high dimensional Gaussian vectors.
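To make the setting of the abstract concrete, the following is a minimal sketch (not the authors' code) of a kernel LS-SVM trained on a two-class Gaussian mixture, the data model assumed in the analysis. The function names, the RBF kernel choice, and the parameter values (gamma, sigma, the mixture means) are illustrative assumptions, not taken from the paper; the sigma ~ sqrt(p) scaling mirrors the large-dimensional regime the paper studies.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma**2))

def lssvm_fit(X, y, gamma=1.0, sigma=1.0):
    """Solve the standard LS-SVM linear system for the bias b and dual weights alpha.

    Block system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # b, alpha

def lssvm_decision(X_train, b, alpha, X_test, sigma=1.0):
    """Decision function g(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

# Two-class Gaussian mixture in dimension p; the paper's regime has p, n both large.
rng = np.random.default_rng(0)
p, n_per = 50, 100
X = np.vstack([rng.normal(-1 / np.sqrt(p), 1.0, (n_per, p)),
               rng.normal(+1 / np.sqrt(p), 1.0, (n_per, p))])
y = np.concatenate([-np.ones(n_per), np.ones(n_per)])

b, alpha = lssvm_fit(X, y, gamma=1.0, sigma=np.sqrt(p))
g = lssvm_decision(X, b, alpha, X, sigma=np.sqrt(p))
```

The paper's result says that, in this regime, the values of g on fresh test points concentrate around class-dependent Gaussian fluctuations whose mean and variance are governed by the kernel's local behavior; a histogram of g per class would make this visible.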


Full work available at URL: https://arxiv.org/abs/1701.02967








Cited In (9)





