High Probability Lower Bounds for the Total Variation Distance

DOI: 10.48550/ARXIV.2005.06006 · arXiv: 2005.06006 · MaRDI QID: Q137286 · FDO: Q137286


Authors: Loris Michel, Jeffrey Näf, Nicolai Meinshausen


Publication date: 12 May 2020

Abstract: The statistics and machine learning communities have recently seen a growing interest in classification-based approaches to two-sample testing. The outcome of a classification-based two-sample test remains a rejection decision, which is not always informative since the null hypothesis is seldom strictly true. Therefore, when a test rejects, it would be beneficial to provide an additional quantity serving as a refined measure of distributional difference. In this work, we introduce a framework for the construction of high-probability lower bounds on the total variation distance. These bounds are based on a one-dimensional projection, such as a classification or regression method, and can be interpreted as the minimal fraction of samples pointing towards a distributional difference. We further derive asymptotic power and detection rates of two proposed estimators and discuss potential uses through an application to a reanalysis climate dataset.
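The sketch below illustrates the general idea behind classification-based lower bounds on the total variation distance; it is not the estimator proposed in the paper. It uses the classical fact that, with balanced samples from P and Q, any classifier's true held-out accuracy is at most (1 + TV(P, Q))/2, so a one-sided Hoeffding correction on the observed test accuracy yields a high-probability lower bound of 2*(acc - eps) - 1 on TV. The choice of RandomForestClassifier, the 50/50 split, and the confidence level alpha are illustrative assumptions.

# Minimal sketch: a high-probability lower bound on TV(P, Q) from a
# classifier's held-out accuracy. NOT the paper's estimator; it combines
# the classical bound acc <= (1 + TV)/2 with a one-sided Hoeffding term.
# The classifier, split sizes, and alpha below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def tv_lower_bound(X_p, X_q, alpha=0.05, seed=0):
    """Lower bound on TV(P, Q) holding with probability >= 1 - alpha."""
    X = np.vstack([X_p, X_q])
    y = np.concatenate([np.zeros(len(X_p)), np.ones(len(X_q))])
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.5, stratify=y, random_state=seed
    )
    clf = RandomForestClassifier(random_state=seed).fit(X_tr, y_tr)
    acc = clf.score(X_te, y_te)  # observed held-out accuracy
    n = len(y_te)
    # One-sided Hoeffding bound: true accuracy >= acc - eps w.p. >= 1 - alpha.
    eps = np.sqrt(np.log(1.0 / alpha) / (2.0 * n))
    # Under balanced sampling, true accuracy <= (1 + TV)/2 for any fixed
    # classifier, hence TV >= 2 * (acc - eps) - 1 with probability >= 1 - alpha.
    return max(0.0, 2.0 * (acc - eps) - 1.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_p = rng.normal(0.0, 1.0, size=(2000, 5))  # samples from P
    X_q = rng.normal(0.5, 1.0, size=(2000, 5))  # mean-shifted samples from Q
    print(f"TV lower bound: {tv_lower_bound(X_p, X_q):.3f}")

The returned value can be read, as in the abstract, as a conservative estimate of the minimal fraction of samples pointing towards a distributional difference; clipping at zero keeps the bound trivially valid when the classifier fails to separate the samples.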








Cited In (1)




