Generalization Error Bounds via Rényi-, f-Divergences and Maximal Leakage
Publication:4958201
DOI: 10.1109/TIT.2021.3085190
zbMATH Open: 1486.94038
arXiv: 1912.01439
OpenAlex: W3170524081
MaRDI QID: Q4958201
FDO: Q4958201
Authors: Amedeo Roberto Esposito, Michael C. Gastpar, Ibrahim Issa
Publication date: 7 September 2021
Published in: IEEE Transactions on Information Theory
Abstract: In this work, the probability of an event under some joint distribution is bounded by measuring it with the product of the marginals instead (which is typically easier to analyze), together with a measure of the dependence between the two random variables. These results find applications in adaptive data analysis, where multiple dependencies are introduced, and in learning theory, where they can be employed to bound the generalization error of a learning algorithm. Bounds are given in terms of Sibson's Mutual Information, α-Divergences, Hellinger Divergences, and f-Divergences. A case of particular interest is the Maximal Leakage (or Sibson's Mutual Information of order infinity), since this measure is robust to post-processing and composes adaptively. The corresponding bound can be seen as a generalization of classical bounds, such as Hoeffding's and McDiarmid's inequalities, to the case of dependent random variables.
Full work available at URL: https://arxiv.org/abs/1912.01439
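As a sketch of the central change-of-measure argument described in the abstract (notation and constants are reconstructed here, not quoted from the paper): on finite alphabets, define the maximal leakage $\mathcal{L}(X \to Y) = \log \sum_{y} \max_{x : P_X(x) > 0} P_{Y|X}(y \mid x)$. For an event $E \subseteq \mathcal{X} \times \mathcal{Y}$ with sections $E_y = \{x : (x,y) \in E\}$, bounding each conditional kernel by its maximum moves the computation from the joint to the marginal:
\[
P_{XY}(E) = \sum_{y} \sum_{x \in E_y} P_X(x)\, P_{Y|X}(y \mid x)
\le \sum_{y} \Big( \max_{x'} P_{Y|X}(y \mid x') \Big) P_X(E_y)
\le e^{\mathcal{L}(X \to Y)} \, \max_{y} P_X(E_y).
\]
Taking $X = S$ (a training sample of $n$ i.i.d. points), $Y = W$ (the hypothesis returned by the algorithm), $E = \{(s,w) : |\mathrm{gen}(w,s)| \ge \eta\}$, and a loss bounded in $[0,1]$, Hoeffding's inequality gives $P_S(E_w) \le 2 e^{-2 n \eta^2}$ for each fixed $w$, so with probability at least $1 - \delta$,
\[
|\mathrm{gen}(W,S)| \le \sqrt{\frac{\mathcal{L}(S \to W) + \log(2/\delta)}{2n}},
\]
which reduces to Hoeffding's bound when $S$ and $W$ are independent, i.e., when $\mathcal{L}(S \to W) = 0$.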
Cited in: 2 documents