Copyright Notice:

The documents distributed by this server have been provided by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a noncommercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Publications of SPCL

T. Ben-Nun, T. Hoefler:

 Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis

(ACM Comput. Surv., Vol. 52, No. 4, pages 65:1--65:43, ACM, ISSN: 0360-0300, Aug. 2019)

Abstract

Deep Neural Networks (DNNs) are becoming an important tool in modern computing applications. Accelerating their training is a major challenge, and techniques range from distributed algorithms to low-level circuit design. In this survey, we describe the problem from a theoretical perspective, followed by approaches for its parallelization. Specifically, we present trends in DNN architectures and the resulting implications on parallelization strategies. We discuss the different types of concurrency in DNNs; synchronous and asynchronous stochastic gradient descent; distributed system architectures; communication schemes; and performance modeling. Based on these approaches, we extrapolate potential directions for parallelism in deep learning.
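
As a concrete illustration of one scheme covered in the survey, the following minimal Python/NumPy sketch shows synchronous data-parallel SGD: each worker computes a gradient on its own shard of the mini-batch, the gradients are averaged (an allreduce in a real distributed setting), and every replica applies the same update. The model, worker count, and hyperparameters below are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
n_workers, n_features, lr = 4, 8, 0.1
w_true = rng.normal(size=n_features)
w = np.zeros(n_features)          # identical model replica on every worker

def local_gradient(w, X, y):
    # Gradient of 0.5*||Xw - y||^2 / len(y) on one worker's data shard.
    return X.T @ (X @ w - y) / len(y)

for step in range(200):
    # Each worker draws its own shard of the global mini-batch.
    shards = []
    for _ in range(n_workers):
        X = rng.normal(size=(32, n_features))
        y = X @ w_true + 0.01 * rng.normal(size=32)
        shards.append((X, y))

    # Synchronous step: average the per-worker gradients (the allreduce),
    # then apply one consistent update on every replica.
    grads = [local_gradient(w, X, y) for X, y in shards]
    w -= lr * np.mean(grads, axis=0)

print("parameter error:", np.linalg.norm(w - w_true))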

ACM Stats

http://doi.acm.org/10.1145/3320060

Documents

download article:
access preprint on arxiv:
download slides:


Recorded talk (best effort)


BibTeX

@article{distdl-preprint,
  author={Tal Ben-Nun and Torsten Hoefler},
  title={{Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis}},
  journal={ACM Comput. Surv.},
  year={2019},
  month={08},
  pages={65:1--65:43},
  volume={52},
  number={4},
  publisher={ACM},
  issn={0360-0300},
  doi={10.1145/3320060},
}