Copyright Notice:

The documents distributed by this server have been provided by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a noncommercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Publications of SPCL

O. Rausch, T. Ben-Nun, N. Dryden, A. Ivanov, S. Li, T. Hoefler:

A Data-Centric Optimization Framework for Machine Learning

(In Proceedings of the 2022 International Conference on Supercomputing (ICS'22), Jul. 2022)


Abstract

Rapid progress in deep learning is leading to a diverse set of quickly changing models, with a dramatically growing demand for compute. However, as frameworks specialize performance optimization to patterns in popular networks, they implicitly constrain novel and diverse models that drive progress in research. We empower deep learning researchers by defining a flexible and user-customizable pipeline for optimizing training of arbitrary deep neural networks, based on data movement minimization. The pipeline begins with standard networks in PyTorch or ONNX and transforms computation through progressive lowering. We define four levels of general-purpose transformations, from local intra-operator optimizations to global data movement reduction. These operate on a data-centric graph intermediate representation that expresses computation and data movement at all levels of abstraction, including expanding basic operators such as convolutions to their underlying computations. Central to the design is the interactive and introspectable nature of the pipeline. Every part is extensible through a Python API, and can be tuned interactively using a GUI. We demonstrate competitive performance or speedups on ten different networks, with interactive optimizations discovering new opportunities in EfficientNet.
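To make the data-movement-minimization idea concrete, here is a minimal illustrative sketch (not the paper's actual API or IR): a toy graph representation in which each operator reads and writes named arrays, together with one local transformation that fuses chains of elementwise operators, eliminating the traffic of the intermediate array. The `Op` structure, the fusion rule, and the cost model are all assumptions introduced for illustration only.

```python
# Toy data-centric IR sketch (hypothetical, for illustration only):
# operators declare which arrays they read and write, so data movement
# is explicit and can be minimized by graph transformations.
from dataclasses import dataclass

@dataclass
class Op:
    name: str
    inputs: list        # names of arrays read
    output: str         # name of array written
    elementwise: bool = True

def bytes_moved(ops, sizes):
    """Total bytes read and written across all operators."""
    return sum(sum(sizes[a] for a in op.inputs) + sizes[op.output]
               for op in ops)

def fuse_elementwise(ops):
    """Local transformation: fuse a producer->consumer pair of
    elementwise ops when the consumer reads only the producer's
    output, removing the intermediate array's traffic."""
    result = []
    for op in ops:
        if (result and result[-1].elementwise and op.elementwise
                and op.inputs == [result[-1].output]):
            prev = result.pop()
            result.append(Op(f"{prev.name}+{op.name}", prev.inputs, op.output))
        else:
            result.append(op)
    return result

# Example pipeline: relu -> scale, with a 100-byte intermediate "t1".
sizes = {"x": 100, "t1": 100, "y": 100}
pipeline = [Op("relu", ["x"], "t1"), Op("scale", ["t1"], "y")]
before = bytes_moved(pipeline, sizes)          # 400 bytes
after = bytes_moved(fuse_elementwise(pipeline), sizes)  # 200 bytes
```

Fusing the two operators halves the traffic in this toy cost model; the paper's pipeline applies analogous (but far more general) transformations at four levels, from local intra-operator rewrites to global data-movement reduction.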


BibTeX

@inproceedings{rausch2022datacentric,
  author={Oliver Rausch and Tal Ben-Nun and Nikoli Dryden and Andrei Ivanov and Shigang Li and Torsten Hoefler},
  title={{A Data-Centric Optimization Framework for Machine Learning}},
  year={2022},
  month={07},
  booktitle={Proceedings of the 2022 International Conference on Supercomputing (ICS'22)},
  doi={10.1145/3524059.3532364},
}