Nonparametric stochastic approximation with large step-sizes

From MaRDI portal
Publication:309706

DOI: 10.1214/15-AOS1391
zbMATH Open: 1346.60041
arXiv: 1408.0361
OpenAlex: W2964198904
MaRDI QID: Q309706


Authors: Aymeric Dieuleveut, Francis Bach


Publication date: 7 September 2016

Published in: The Annals of Statistics

Abstract: We consider the random-design least-squares regression problem within the reproducing kernel Hilbert space (RKHS) framework. Given a stream of independent and identically distributed input/output data, we aim to learn a regression function within an RKHS $\mathcal{H}$, even if the optimal predictor (i.e., the conditional expectation) is not in $\mathcal{H}$. In a stochastic approximation framework where the estimator is updated after each observation, we show that the averaged unregularized least-mean-square algorithm (a form of stochastic gradient descent), given a sufficiently large step-size, attains optimal rates of convergence across a variety of regimes for the smoothness of the optimal prediction function and of the functions in $\mathcal{H}$.
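The averaged unregularized least-mean-square recursion described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: the Gaussian kernel, bandwidth, step-size, and synthetic data are assumptions made for the example. Each observation adds one kernel coefficient, and the returned averaged (Polyak–Ruppert) iterate weights each coefficient by how many iterates it participated in.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # Illustrative kernel choice; any positive-definite kernel would do.
    return np.exp(-(x - y) ** 2 / (2 * bandwidth ** 2))

def averaged_kernel_lms(xs, ys, step_size=0.5, bandwidth=1.0):
    """One pass of unregularized kernel LMS (stochastic gradient on the
    squared loss), returning coefficients of both the final iterate and
    the averaged iterate.  coef[i] multiplies K(xs[i], .)."""
    n = len(xs)
    coef = np.zeros(n)
    for t in range(n):
        # Prediction of the current iterate g_t at the new point x_t
        # (only the first t coefficients have been set so far).
        pred = sum(coef[i] * gaussian_kernel(xs[i], xs[t], bandwidth)
                   for i in range(t))
        # LMS update: the new point enters with coefficient
        # step_size * (y_t - g_t(x_t)); earlier coefficients are untouched.
        coef[t] = step_size * (ys[t] - pred)
    # Averaging g_0, ..., g_n (with g_0 = 0): coefficient i appears in the
    # n - i iterates g_{i+1}, ..., g_n, out of n + 1 iterates in total.
    avg_coef = coef * (n - np.arange(n)) / (n + 1)
    return coef, avg_coef

def predict(x, xs, coef, bandwidth=1.0):
    # Evaluate a kernel expansion sum_i coef[i] * K(xs[i], x).
    return sum(c * gaussian_kernel(xi, x, bandwidth)
               for xi, c in zip(xs, coef))
```

The per-step cost grows with the number of observations seen, since each prediction sums over all previous kernel coefficients; this matches the non-parametric nature of the estimator.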


Full work available at URL: https://arxiv.org/abs/1408.0361




Recommendations




Cites Work


Cited In (47)

Uses Software





This page was built for publication: Nonparametric stochastic approximation with large step-sizes
