Publications of Torsten Hoefler
Andrei Ivanov, Nikoli Dryden, Tal Ben-Nun, Shigang Li, Torsten Hoefler:
Data Movement Is All You Need: A Case Study on Optimizing Transformers
(In Proceedings of Machine Learning and Systems 3 (MLSys 2021), Apr. 2021)
Outstanding Paper Award (5/52)

Abstract

Transformers have become widely used for language modeling and sequence learning tasks, and are one of the most important machine learning workloads today. Training one is a very compute-intensive task, often taking days or weeks, and significant attention has been given to optimizing transformers. Despite this, existing implementations do not efficiently utilize GPUs. We find that data movement is the key bottleneck when training. Due to Amdahl’s Law and massive improvements in compute performance, training has now become memory-bound. Further, existing frameworks use suboptimal data layouts. Using these insights, we present a recipe for globally optimizing data movement in transformers. We reduce data movement by up to 22.91% and overall achieve a 1.30× performance improvement over state-of-the-art frameworks when training BERT. Our approach is applicable more broadly to optimizing deep neural networks, and offers insight into how to tackle emerging performance bottlenecks.
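
To make the "memory-bound" claim concrete, the following is a minimal roofline-style sketch in Python. The peak throughput and bandwidth are published vendor specs for an NVIDIA V100; the softmax operator and its flop and byte counts are rough illustrative estimates, not measurements from the paper.

# Roofline sketch: is an operator compute-bound or memory-bound?
PEAK_FLOPS = 125e12   # V100 fp16 tensor-core peak, flop/s (vendor spec)
PEAK_BW    = 900e9    # V100 HBM2 bandwidth, byte/s (vendor spec)
machine_balance = PEAK_FLOPS / PEAK_BW   # ~139 flop/byte

# Hypothetical operator: fp16 softmax over a BERT-style attention tensor.
B, H, S = 8, 16, 512                 # batch size, attention heads, sequence length
elems = B * H * S * S
flops = 5 * elems                    # rough per-element count: max, sub, exp, sum, div
bytes_moved = 2 * 2 * elems          # one read + one write per element, 2 bytes each

intensity = flops / bytes_moved      # 1.25 flop/byte
bound = "compute" if intensity > machine_balance else "memory"
print(f"machine balance: {machine_balance:.0f} flop/byte")
print(f"softmax arithmetic intensity: {intensity:.2f} flop/byte ({bound}-bound)")

An operator whose arithmetic intensity sits two orders of magnitude below the machine balance spends nearly all of its time waiting on memory, which is why reducing data movement, rather than flops, is the profitable optimization target the abstract describes.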

BibTeX

@inproceedings{data-movement-is-all-you-need,
  author={Andrei Ivanov and Nikoli Dryden and Tal Ben-Nun and Shigang Li and Torsten Hoefler},
  title={{Data Movement Is All You Need: A Case Study on Optimizing Transformers}},
  year={2021},
  month={Apr.},
  booktitle={Proceedings of Machine Learning and Systems 3 (MLSys 2021)},
  source={http://www.unixer.de/~htor/publications/},
}

