Learning a tree-structured Ising model in order to make predictions


DOI: 10.1214/19-AOS1808
zbMATH Open: 1469.62334
arXiv: 1604.06749
MaRDI QID: Q2196190
FDO: Q2196190


Authors: Guy Bresler, Mina Karzand


Publication date: 28 August 2020

Published in: The Annals of Statistics

Abstract: We study the problem of learning a tree Ising model from samples such that subsequent predictions made using the model are accurate. The prediction task considered in this paper is that of predicting the values of a subset of variables given values of some other subset of variables. Virtually all previous work on graphical model learning has focused on recovering the true underlying graph. We define a distance ("small set TV" or ssTV) between distributions P and Q by taking the maximum, over all subsets \(\mathcal{S}\) of a given size, of the total variation between the marginals of P and Q on \(\mathcal{S}\); this distance captures the accuracy of the prediction task of interest. We derive non-asymptotic bounds on the number of samples needed to get a distribution (from the same class) with small ssTV relative to the one generating the samples. One of the main messages of this paper is that far fewer samples are needed than for recovering the underlying tree, which means that accurate predictions are possible using the wrong tree.
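
In symbols, the abstract's distance is \(\mathrm{ssTV}_k(P,Q) = \max_{|\mathcal{S}|=k} d_{\mathrm{TV}}(P_{\mathcal{S}}, Q_{\mathcal{S}})\), where \(P_{\mathcal{S}}\) denotes the marginal of P on the variables in \(\mathcal{S}\). The sketch below is not from the paper; the function name and the brute-force enumeration over subsets are illustrative assumptions, and it only demonstrates the definition for small discrete distributions given as full joint probability tables.

```python
from itertools import combinations
import numpy as np

def small_set_tv(P, Q, k):
    """Max over all variable subsets S of size k of the total variation
    distance between the marginals of P and Q on S (the ssTV distance
    as defined in the abstract; brute-force illustration only)."""
    n = P.ndim
    worst = 0.0
    for S in combinations(range(n), k):
        other = tuple(i for i in range(n) if i not in S)
        # Marginalize out the variables not in S.
        P_S = P.sum(axis=other)
        Q_S = Q.sum(axis=other)
        worst = max(worst, 0.5 * np.abs(P_S - Q_S).sum())
    return worst

# Example: two joint distributions over 3 binary variables.
rng = np.random.default_rng(0)
P = rng.random((2, 2, 2)); P /= P.sum()
Q = rng.random((2, 2, 2)); Q /= Q.sum()
print(small_set_tv(P, Q, k=2))
```

Enumerating all subsets of size k is exponential in general; it is meant only to make the definition concrete, not to reflect how the paper's bounds are obtained.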


Full work available at URL: https://arxiv.org/abs/1604.06749








