Orthogonal statistical learning


DOI: 10.1214/23-AOS2258
arXiv: 1901.09036
MaRDI QID: Q6136574
FDO: Q6136574


Authors: Dylan J. Foster, Vasilis Syrgkanis


Publication date: 31 August 2023

Published in: The Annals of Statistics

Abstract: We provide non-asymptotic excess risk guarantees for statistical learning in a setting where the population risk with respect to which we evaluate the target parameter depends on an unknown nuisance parameter that must be estimated from data. We analyze a two-stage sample splitting meta-algorithm that takes as input arbitrary estimation algorithms for the target parameter and nuisance parameter. We show that if the population risk satisfies a condition called Neyman orthogonality, the impact of the nuisance estimation error on the excess risk bound achieved by the meta-algorithm is of second order. Our theorem is agnostic to the particular algorithms used for the target and nuisance and only makes an assumption on their individual performance. This enables the use of a plethora of existing results from machine learning to give new guarantees for learning with a nuisance component. Moreover, by focusing on excess risk rather than parameter estimation, we can provide rates under weaker assumptions than in previous works and accommodate settings in which the target parameter belongs to a complex nonparametric class. We provide conditions on the metric entropy of the nuisance and target classes such that oracle rates of the same order as if we knew the nuisance parameter are achieved.


Full work available at URL: https://arxiv.org/abs/1901.09036
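
Below is a minimal runnable sketch (in Python; not the authors' code) of the two-stage sample-splitting meta-algorithm described in the abstract. As an illustrative assumption, it is instantiated for the partially linear model Y = theta0*T + f0(X) + noise with the Neyman-orthogonal squared loss ((Y - q(X)) - theta*(T - p(X)))^2, where the nuisance parameter is the pair q(X) = E[Y|X], p(X) = E[T|X]. Orthogonality makes the cross-derivative of the population risk in (target, nuisance) vanish at the truth, which is why nuisance estimation error enters the excess risk only at second order. The names orthogonal_meta_algorithm and nuisance_learner, and the random-forest choice, are hypothetical.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def orthogonal_meta_algorithm(X, T, Y, nuisance_learner=RandomForestRegressor):
    # Sample splitting: estimate the nuisance on one half of the data,
    # then the target parameter on the other half.
    n = len(Y)
    idx = np.random.default_rng(0).permutation(n)
    fold1, fold2 = idx[: n // 2], idx[n // 2:]

    # Stage 1: arbitrary ML estimators for the nuisance pair (q, p).
    q_hat = nuisance_learner().fit(X[fold1], Y[fold1])  # q(X) ~ E[Y|X]
    p_hat = nuisance_learner().fit(X[fold1], T[fold1])  # p(X) ~ E[T|X]

    # Stage 2: minimize the orthogonal empirical risk on the held-out fold.
    # For a scalar target theta, the minimizer of
    # mean(((Y - q(X)) - theta*(T - p(X)))^2) is residual-on-residual OLS.
    Y_res = Y[fold2] - q_hat.predict(X[fold2])
    T_res = T[fold2] - p_hat.predict(X[fold2])
    return float(T_res @ Y_res / (T_res @ T_res))

# Synthetic check: true theta0 = 2.0 with nonlinear confounding through X.
rng = np.random.default_rng(1)
X = rng.normal(size=(4000, 5))
T = np.sin(X[:, 0]) + rng.normal(size=4000)
Y = 2.0 * T + np.cos(X[:, 0]) + X[:, 1] ** 2 + rng.normal(size=4000)
print(orthogonal_meta_algorithm(X, T, Y))  # prints a value close to 2.0

Because the meta-algorithm is agnostic to the learners used in each stage, swapping RandomForestRegressor for any other regressor leaves the structure, and the second-order dependence on nuisance error, unchanged; only the individual performance of each stage matters for the resulting excess risk bound.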









Cited in: 8 documents




