Optimal learning with Bernstein Online Aggregation


DOI: 10.48550/ARXIV.1404.1356
zbMATH Open: 1412.68196
arXiv: 1404.1356
OpenAlex: W1835684196
MaRDI QID: Q72768

Olivier Wintenberger

Publication date: 4 April 2014

Published in: Machine Learning

Abstract: We introduce a new recursive aggregation procedure called Bernstein Online Aggregation (BOA). The exponential weights include an accuracy term and a second-order term that serves as a proxy for the quadratic variation, as in Hazan and Kale (2010). This second term stabilizes the procedure, which is optimal in several senses. We first obtain optimal regret bounds in the deterministic context. An adaptive version is then the first exponential weights algorithm to exhibit a second-order bound with excess losses, a type of bound that first appeared in Gaillard et al. (2014). The second-order bounds in the deterministic context are extended to a general stochastic context using the cumulative predictive risk. This conversion yields the main result of the paper: an inequality of a novel type comparing the procedure with any deterministic aggregation procedure under an integrated criterion. We then obtain an observable estimate of the excess risk of the BOA procedure. Finally, to assess optimality, we consider the i.i.d. case with strongly convex and Lipschitz continuous losses and prove that the optimal rate of aggregation of Tsybakov (2003) is achieved. The batch version of the BOA procedure is thus the first adaptive explicit algorithm satisfying an optimal oracle inequality with high probability.
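
The abstract describes the BOA update only informally. The following minimal Python sketch illustrates the fixed-learning-rate form, assuming squared loss, excess losses measured against the aggregated prediction, and a uniform prior; the function name boa, the rate eta, and the demo data are illustrative choices, not taken from the paper.

    import numpy as np

    def boa(expert_preds, targets, eta=0.1, loss=lambda p, y: (p - y) ** 2):
        """Sketch of Bernstein Online Aggregation with a fixed learning rate.

        Each round, the exponent of the exponential weights combines the
        excess loss (accuracy term) with an eta-scaled squared excess loss
        (the second-order term acting as a variance proxy).

        expert_preds: (T, K) array, prediction of each of K experts per round.
        targets:      (T,) array of observations.
        """
        T, K = expert_preds.shape
        log_w = np.zeros(K)            # uniform prior, kept in log space
        agg = np.empty(T)
        for t in range(T):
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            agg[t] = w @ expert_preds[t]          # aggregated prediction
            # Excess loss of each expert over the aggregated prediction.
            ell = loss(expert_preds[t], targets[t]) - loss(agg[t], targets[t])
            # BOA-style update: first-order term plus second-order term.
            log_w -= eta * ell * (1.0 + eta * ell)
        return agg

    # Illustrative usage: one informative expert, one pure-noise expert.
    rng = np.random.default_rng(0)
    T = 500
    y = rng.normal(size=T)
    experts = np.column_stack([y + 0.3 * rng.normal(size=T),
                               rng.normal(size=T)])
    preds = boa(experts, y)

Weights are stored in log space for numerical stability. The adaptive version discussed in the paper tunes the learning rate from the observed losses rather than fixing eta in advance; that tuning is omitted here.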


Full work available at URL: https://arxiv.org/abs/1404.1356











