Piatkowski/2019c: Distributed Generative Modelling with Sub-Linear Communication Overhead

Bibtype Inproceedings
Bibkey Piatkowski/2019c
Author Piatkowski, Nico
Ls8autor Piatkowski, Nico
Editor Cellier, Peggy and Driessens, Kurt
Title Distributed Generative Modelling with Sub-Linear Communication Overhead
Booktitle Machine Learning and Knowledge Discovery in Databases - International Workshops of ECML PKDD 2019, Würzburg, Germany, September 16-20, 2019, Proceedings, Part I
Series Communications in Computer and Information Science
Volume 1167
Pages 281--292
Publisher Springer
Abstract Pushing machine learning towards the edge often implies a restriction to ultra-low-power (ULP) devices with rather limited compute capabilities. Generative models estimate the data-generating probability mass P, which can in turn be used for various tasks, including simulation, prediction/forecasting, and novelty detection. Whenever the actual learning task is unknown at learning time, or the task is allowed to change over time, learning a generative model is the only viable option. However, learning such models on resource-constrained systems raises several challenges. Recent advances in exponential family learning allow us to estimate sophisticated models on highly resource-constrained systems. Nevertheless, the setting in which the training data is distributed among several devices in a network with presumably high communication costs has not yet been investigated. We close this gap by deriving and exploiting a new property of integer models. More precisely, we present a model averaging scheme whose communication complexity is sub-linear w.r.t. the parameter dimension d, and we provide an upper bound on the global loss. Experimental results on benchmark data show that the aggregated model is often on par with the non-distributed global model. (See the sketch after the record below.)
Year 2019
Url https://dmle.iais.fraunhofer.de/papers/piatkowski2019distributed.pdf
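The abstract describes model averaging over integer exponential family models whose communication cost grows sub-linearly in the parameter dimension d. The Python sketch below only illustrates the general idea under one plausible assumption: that integer parameter vectors are sparse after training, so each device can send (index, value) pairs for its nonzero entries instead of all d coordinates, and the aggregator averages them. The function names and the sparse encoding are illustrative assumptions, not the exact scheme derived in the paper.

    import numpy as np

    def to_sparse(theta):
        # Encode an integer parameter vector as (index, value) pairs for its
        # nonzero entries; message size is O(nnz) instead of O(d).
        # (Illustrative assumption, not the paper's exact encoding.)
        idx = np.flatnonzero(theta)
        return idx, theta[idx]

    def aggregate(messages, d, n_devices):
        # Average the sparse integer models received from all devices.
        total = np.zeros(d, dtype=np.int64)
        for idx, vals in messages:
            total[idx] += vals
        return total / n_devices

    # Toy example: 3 devices, parameter dimension d = 10.
    d = 10
    local_models = [
        np.array([0, 2, 0, 0, 1, 0, 0, 0, 3, 0]),
        np.array([1, 2, 0, 0, 0, 0, 0, 0, 3, 0]),
        np.array([0, 1, 0, 0, 1, 0, 0, 0, 4, 0]),
    ]
    messages = [to_sparse(t) for t in local_models]
    theta_avg = aggregate(messages, d, len(local_models))
    print(theta_avg)

In this toy setting each device transmits only its handful of nonzero integer entries, so the communication per device scales with the sparsity of the local model rather than with d; this is one way a sub-linear overhead can arise, though the paper's actual property of integer models may differ.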


