Learning of Tree-Structured Gaussian Graphical Models on Distributed Data Under Communication Constraints
From MaRDI portal
Publication:4628161
Abstract: In this paper, learning of tree-structured Gaussian graphical models from distributed data is addressed. In our model, samples are stored on a set of distributed machines, where each machine has access to only a subset of features. A central machine is then responsible for learning the structure based on messages received from the other nodes. We present a set of communication-efficient strategies, which are theoretically proven to convey sufficient information for reliable learning of the structure. In particular, our analyses show that even if each machine sends only the signs of its local data samples to the central node, the tree structure can still be recovered with high accuracy. Our simulation results on both synthetic and real-world datasets show that our strategies achieve the desired accuracy in inferring the underlying structure while spending only a small communication budget.
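The signs-only strategy described in the abstract can be illustrated with a minimal sketch (not the authors' actual algorithm): for a zero-mean Gaussian pair, Grothendieck's identity E[sign(X_i)sign(X_j)] = (2/π)·arcsin(ρ_ij) lets the central node recover pairwise correlations from sign agreements alone, after which a Chow-Liu maximum-weight spanning tree recovers the tree structure. The data-generating chain below and the helper names are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def estimate_corr_from_signs(signs):
    # signs: (n_samples, n_features) array of +/-1 messages from the machines.
    # Grothendieck's identity for zero-mean Gaussians:
    #   E[sign(X_i) sign(X_j)] = (2/pi) * arcsin(rho_ij),
    # so we invert it on the empirical sign-agreement matrix.
    agreement = signs.T @ signs / signs.shape[0]
    return np.sin(np.pi / 2 * np.clip(agreement, -1.0, 1.0))

def chow_liu_tree(corr):
    # Chow-Liu: maximum-weight spanning tree under pairwise mutual
    # information. For Gaussians I(i;j) = -0.5*log(1 - rho^2) is monotone
    # in |rho|, so a minimum spanning tree on -|rho| yields the same tree.
    weights = -np.abs(corr)
    np.fill_diagonal(weights, 0.0)
    mst = minimum_spanning_tree(weights)
    rows, cols = mst.nonzero()
    return sorted(tuple(sorted(map(int, e))) for e in zip(rows, cols))

rng = np.random.default_rng(0)
n = 20000
# Illustrative tree (Markov chain 0-1-2-3) with unit variances and
# edge correlation 0.8 on every link.
x0 = rng.standard_normal(n)
x1 = 0.8 * x0 + 0.6 * rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
x3 = 0.8 * x2 + 0.6 * rng.standard_normal(n)
X = np.column_stack([x0, x1, x2, x3])

corr = estimate_corr_from_signs(np.sign(X))
print(chow_liu_tree(corr))  # → [(0, 1), (1, 2), (2, 3)]
```

Each sample costs one bit per feature to transmit, which is the sense in which the sign strategy is communication-efficient; the sketch recovers the chain because the arcsine correction makes the sign-based correlation estimates consistent.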
Recommendations
- Distributed Covariance Estimation in Gaussian Graphical Models
- Consistency in models for distributed learning under communication constraints
- Learning Theory
- A Framework for Learning from Distributed Data Using Sufficient Statistics and Its Application to Learning Decision Trees
- Distributed Inference in Tree Networks Using Coding Theory
- Communication-efficient distributed statistical inference
- Learning distributed Bayesian network structure using majority-based method
- Testing the Structure of a Gaussian Graphical Model With Reduced Transmissions in a Distributed Setting
- Node-based learning of multiple Gaussian graphical models