Universal Robust Regression via Maximum Mean Discrepancy

Publication:6341807

arXiv: 2006.00840 · MaRDI QID: Q6341807 · FDO: Q6341807


Authors: Pierre Alquier, Mathieu Gerber


Publication date: 1 June 2020

Abstract: Many modern datasets are collected automatically and are thus easily contaminated by outliers. This has led to renewed interest in robust estimation, including new notions of robustness such as robustness to adversarial contamination of the data. However, most robust estimation methods are designed for a specific model. Notably, many methods have recently been proposed to obtain robust estimators in linear models (or generalized linear models), and a few have been developed for very specific settings, for example beta regression or sample selection models. In this paper we develop a new approach to robust estimation in arbitrary regression models, based on Maximum Mean Discrepancy minimization. We build two estimators, both of which are proven to be robust to Huber-type contamination. We obtain a non-asymptotic error bound for one of them and show that it is also robust to adversarial contamination, although this estimator is computationally more expensive to use in practice than the other. As a by-product of our theoretical analysis of the proposed estimators we derive new results on kernel conditional mean embeddings of distributions, which are of independent interest.
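The abstract only names the estimation principle, so the following is a minimal sketch of what MMD-based regression estimation can look like in the simplest case: choose the parameter whose model-simulated (x, y) pairs are closest to the observed pairs in squared Maximum Mean Discrepancy under a chosen kernel. This illustrates the general principle only, not the paper's two estimators; the Gaussian kernel, its bandwidth, the simulated noise level, and the Nelder-Mead optimizer are assumptions made purely for the example.

```python
# Minimal sketch of MMD-based regression estimation (illustration only, not the
# paper's estimators). We fit a toy linear model y = a*x + b + noise by
# minimizing the squared MMD between observed (x, y) pairs and pairs simulated
# from the candidate parameters.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + 1 + noise, with 5% gross (Huber-type) contamination.
n = 200
x = rng.normal(size=n)
y = 2.0 * x + 1.0 + 0.3 * rng.normal(size=n)
y[:10] += 20.0
obs = np.column_stack([x, y])

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(A, B, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared MMD between samples A and B."""
    return (gaussian_kernel(A, A, bandwidth).mean()
            - 2.0 * gaussian_kernel(A, B, bandwidth).mean()
            + gaussian_kernel(B, B, bandwidth).mean())

# Fixed covariates and noise for the simulator (common random numbers), so the
# objective is a deterministic function of the parameters.
m = 200
xs = rng.choice(x, size=m)
eps = 0.3 * rng.normal(size=m)

def objective(theta):
    """Squared MMD between the data and pairs simulated from y = a*x + b + eps."""
    a, b = theta
    sim = np.column_stack([xs, a * xs + b + eps])
    return mmd2(obs, sim)

result = minimize(objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print("estimated (slope, intercept):", result.x)  # roughly (2, 1) despite the outliers
```

Because points far from the bulk of the data contribute almost nothing to the Gaussian-kernel terms, the contaminated observations have only a bounded effect on the objective, which is the intuition behind the robustness of MMD minimization.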

