Learning rates for the risk of kernel-based quantile regression estimators in additive models


DOI: 10.1142/S0219530515500050
zbMATH Open: 1338.62077
arXiv: 1405.3379
MaRDI QID: Q2805231

Ding-Xuan Zhou, Andreas Christmann

Publication date: 10 May 2016

Published in: Analysis and Applications (Singapore)

Abstract: Additive models play an important role in semiparametric statistics. This paper gives learning rates for regularized kernel-based methods for additive models. These learning rates compare favourably, in particular in high dimensions, to recent results on optimal learning rates for purely nonparametric regularized kernel-based quantile regression using the Gaussian radial basis function kernel, provided the assumption of an additive model is valid. Additionally, a concrete example is presented to show that a Gaussian function depending only on one variable lies in a reproducing kernel Hilbert space generated by an additive Gaussian kernel, but does not belong to the reproducing kernel Hilbert space generated by the multivariate Gaussian kernel of the same variance.


Full work available at URL: https://arxiv.org/abs/1405.3379
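
The following is a minimal sketch of the setting the abstract describes: regularized kernel-based quantile regression with an additive Gaussian kernel k_add(x, x') = (1/d) Σ_j exp(-γ (x_j - x'_j)²), in contrast to the multivariate Gaussian kernel exp(-γ ‖x - x'‖²) mentioned in the paper's concrete example. The estimator minimizes the empirical pinball (quantile) loss plus an RKHS-norm penalty over f = Σ_i α_i k_add(x_i, ·). This is not the authors' code or estimator; the function names, toy data, solver (plain subgradient descent), and hyperparameters (τ, γ, λ, step size) are illustrative assumptions.

```python
# A minimal illustrative sketch (not the authors' estimator or code) of regularized
# kernel-based quantile regression with an additive Gaussian kernel.  Function names,
# the toy data, and all hyperparameters (tau, gamma, lam, step size) are assumptions.
import numpy as np


def additive_gaussian_kernel(X, Z, gamma=1.0):
    """Additive Gaussian kernel: average of univariate Gaussian kernels, one per coordinate."""
    n, d = X.shape
    K = np.zeros((n, Z.shape[0]))
    for j in range(d):
        diff = X[:, [j]] - Z[:, [j]].T            # pairwise differences in coordinate j
        K += np.exp(-gamma * diff ** 2)           # univariate Gaussian kernel on coordinate j
    return K / d


def fit_kernel_quantile(K, y, tau=0.5, lam=1e-2, lr=0.05, n_iter=3000):
    """Subgradient descent on pinball loss + RKHS-norm penalty, with f(x_i) = (K @ alpha)_i."""
    n = len(y)
    alpha = np.zeros(n)
    for t in range(n_iter):
        r = y - K @ alpha                         # residuals at the current fit
        g = np.where(r > 0, -tau, 1.0 - tau)      # subgradient of pinball loss w.r.t. f(x_i)
        grad = K @ g / n + 2.0 * lam * (K @ alpha)
        alpha -= lr / np.sqrt(t + 1.0) * grad     # decaying step, standard for subgradient methods
    return alpha


# Toy additive model: true regression function sin(x1) + 0.5 * x2 with Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(200)

K = additive_gaussian_kernel(X, X, gamma=2.0)
alpha = fit_kernel_quantile(K, y, tau=0.75)       # estimate the 0.75 conditional quantile
print("fraction of points below the fitted 0.75-quantile curve:",
      np.mean(y <= K @ alpha).round(2))
```

Because the pinball loss at level τ = 0.75 penalizes over- and under-estimation asymmetrically, roughly 75% of the training responses should fall below the fitted curve; the final print is a quick sanity check of the sketch in that sense.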










Cited in: 16 documents





