Learning rates for kernel-based expectile regression

From MaRDI portal
Publication: Q669274

DOI: 10.1007/S10994-018-5762-9
zbMATH Open: 1480.62067
arXiv: 1702.07552
OpenAlex: W2592603639
Wikidata: Q129210583 (Scholia: Q129210583)
MaRDI QID: Q669274 (FDO: Q669274)


Authors: Ingo Steinwart, M. U. Farooq


Publication date: 15 March 2019

Published in: Machine Learning

Abstract: Conditional expectiles are becoming an increasingly important tool in finance as well as in other areas of application. We analyse a support vector machine type approach for estimating conditional expectiles and establish learning rates that are minimax optimal modulo a logarithmic factor if Gaussian RBF kernels are used and the desired expectile is smooth in a Besov sense. As a special case, our learning rates improve the best known rates for kernel-based least squares regression in this scenario. Key ingredients of our statistical analysis are a general calibration inequality for the asymmetric least squares loss, a corresponding variance bound, and an improved entropy number bound for Gaussian RBF kernels.
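The abstract describes an SVM-type estimator that minimizes a regularized empirical risk built from the asymmetric least squares (expectile) loss over a Gaussian RBF kernel space. The following is a minimal sketch of such an estimator, not code from the paper: the function names (`kernel_expectile`, `gaussian_kernel`) and the iteratively reweighted kernel ridge solver are illustrative choices, and the hyperparameters are arbitrary.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_expectile(X, y, tau=0.5, gamma=1.0, lam=1e-3, n_iter=100):
    """Fit f(x) = sum_i alpha_i k(x_i, x) by minimizing the regularized
    asymmetric least squares risk
        (1/n) * sum_i w_i(tau) * (y_i - f(x_i))^2 + lam * ||f||_H^2,
    where w_i = tau if the residual is nonnegative and 1 - tau otherwise.
    Solved here by iteratively reweighted kernel ridge regression
    (an illustrative solver, not the paper's algorithm)."""
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    w = np.full(n, 0.5)  # symmetric start: plain kernel least squares
    alpha = np.zeros(n)
    for _ in range(n_iter):
        # Weighted kernel ridge step: (W K + n*lam*I) alpha = W y
        alpha = np.linalg.solve(w[:, None] * K + n * lam * np.eye(n), w * y)
        r = y - K @ alpha
        w_new = np.where(r >= 0, tau, 1.0 - tau)
        if np.array_equal(w_new, w):  # weights fixed -> KKT point reached
            break
        w = w_new
    return alpha

# Usage: estimate the 0.9-expectile of y given x on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * rng.standard_normal(80)
alpha = kernel_expectile(X, y, tau=0.9, gamma=5.0)
```

For `tau = 0.5` the loss is the ordinary squared loss and the procedure reduces to kernel ridge (least squares) regression, which matches the abstract's remark that least squares regression is a special case of this setting.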


Full work available at URL: https://arxiv.org/abs/1702.07552










Cited In (13)






This page was built for publication: Learning rates for kernel-based expectile regression
