Campus Units

Computer Science, Electrical and Computer Engineering

Document Type

Article

Publication Date

2017

Journal or Book Title

arXiv

Abstract

A current challenge for data management systems is to support the construction and maintenance of machine learning models over data that is large, multi-dimensional, and evolving. While systems that can support these tasks are emerging, the need to scale to distributed, streaming data requires new models and algorithms. In this setting, in addition to computational scalability and model accuracy, we must also minimize the amount of communication between distributed processors, which is the chief component of latency. We study Bayesian networks, the workhorse of graphical models, and present a communication-efficient method for continuously learning and maintaining a Bayesian network model over data arriving as a distributed stream partitioned across multiple processors. We show a strategy for maintaining model parameters that leads to an exponential reduction in communication compared with baseline approaches that maintain the exact maximum likelihood estimate (MLE), while providing prediction errors comparable to the exact MLE for the target distribution and for classification tasks.
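
The abstract describes the strategy only at a high level. Below is a minimal Python sketch of one standard technique that is consistent with the claimed exponential communication reduction: each site forwards a counter update to a coordinator only when its local count has grown by a (1 + eps) factor since the last report, so a counter reaching n triggers on the order of (log n)/eps messages rather than n. The class names SiteCounter and Coordinator, the method cpt_estimate, and the reporting rule itself are illustrative assumptions for exposition, not the paper's actual algorithm or API.

from collections import defaultdict

class SiteCounter:
    """One stream processor (site) in a hypothetical sketch of
    communication-efficient counting. A (parent_config, value) count is
    reported to the coordinator only when it has grown by a (1 + eps)
    factor since the last report, not on every observation."""

    def __init__(self, coordinator, eps=0.1):
        self.coordinator = coordinator
        self.eps = eps
        self.local = defaultdict(int)          # exact local counts
        self.last_reported = defaultdict(int)  # count at last report

    def observe(self, key):
        self.local[key] += 1
        # Report on the first observation, then only on multiplicative
        # growth; this is what bounds the number of messages per counter.
        if (self.last_reported[key] == 0 or
                self.local[key] >= (1 + self.eps) * self.last_reported[key]):
            delta = self.local[key] - self.last_reported[key]
            self.coordinator.receive(key, delta)
            self.last_reported[key] = self.local[key]

class Coordinator:
    """Maintains approximate global counts and derives smoothed
    conditional probability table (CPT) estimates from them on demand."""

    def __init__(self):
        self.counts = defaultdict(int)

    def receive(self, key, delta):
        self.counts[key] += delta

    def cpt_estimate(self, parent_config, values, alpha=1.0):
        """Laplace-smoothed estimate of P(X | parents) from the
        approximate counts held at the coordinator."""
        totals = {v: self.counts[(parent_config, v)] + alpha for v in values}
        z = sum(totals.values())
        return {v: c / z for v, c in totals.items()}

A short usage example under the same assumptions: four sites observe a stream of (parent configuration, value) pairs, and the coordinator answers CPT queries from its approximate counts.

coord = Coordinator()
sites = [SiteCounter(coord, eps=0.1) for _ in range(4)]
for i in range(1000):
    sites[i % 4].observe((("Rain=1",), "WetGrass=1"))
print(coord.cpt_estimate(("Rain=1",), ["WetGrass=0", "WetGrass=1"]))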

Comments

This is a manuscript of the article Zhang, Yu, Srikanta Tirthapura, and Graham Cormode. "Learning Graphical Models from a Distributed Stream." arXiv preprint arXiv:1710.02103 (2017). Posted with permission.

Language

en

File Format

application/pdf
