PLDI 2020
Mon 15 - Fri 19 June 2020
Tue 16 Jun 2020 12:00 - 12:30 at MAPL live stream - Compilers for Deep Learning Frameworks Chair(s): Charles Sutton

The growing interest in deep learning has created a demand to compile computation graphs, both to accelerate execution and to deploy applications on various devices. Modern deep learning frameworks construct computation graphs dynamically, which gives rise to the problem of inferring types for dynamic computation graphs. Two approaches are known. One is a dynamic approach that constructs the computation graph from the execution trace of an actual input. While this approach is straightforward, its shape inference results consist only of concrete values and are often difficult for users to interpret. The other performs static analysis over the source program. This method can produce symbolic shape inference results, but it suffers from the dynamic nature of the host programming language, Python.
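To make the contrast concrete, the sketch below illustrates the dynamic, trace-based approach in PyTorch: the network is run on a single example input and the concrete shape of each submodule's output is recorded via forward hooks. This is only an illustration, not the authors' tool; the TinyNet module, the trace_shapes helper, and the chosen input size are assumptions made for the example.

```python
# Minimal sketch of trace-based (dynamic) shape inference in PyTorch.
# Illustrative only; names and sizes are assumptions, not the paper's engine.
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.fc = nn.Linear(16 * 32 * 32, 10)

    def forward(self, x):
        h = torch.relu(self.conv(x))
        return self.fc(h.flatten(start_dim=1))


def trace_shapes(model, example_input):
    """Record the concrete output shape of every named submodule for one input."""
    shapes = {}
    hooks = [
        m.register_forward_hook(
            lambda mod, inp, out, name=name: shapes.setdefault(name, tuple(out.shape))
        )
        for name, m in model.named_modules()
        if name  # skip the root module itself
    ]
    with torch.no_grad():
        model(example_input)
    for h in hooks:
        h.remove()
    return shapes


print(trace_shapes(TinyNet(), torch.randn(8, 3, 32, 32)))
# e.g. {'conv': (8, 16, 32, 32), 'fc': (8, 10)} -- concrete values only;
# the traced batch size 8 is baked in, which is what makes such results
# hard for users to interpret.
```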

In this paper, we propose a novel approach to type, shape, and symbolic shape inference for dynamic computation graphs that mixes the two methods above. We present results of applying our prototype inference engine to networks written in PyTorch and demonstrate its effectiveness on nontrivial networks.
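As a rough, hypothetical illustration of the kind of output a mixed, semi-static analysis could aim for (the paper's actual engine and its interface are not described on this page), the sketch below keeps trace-derived constants concrete while leaving input-dependent dimensions symbolic. The TensorType class and the "batch" symbol are invented for the example.

```python
# Hypothetical illustration of semi-static shape inference output:
# concrete dimensions stay as integers, input-dependent ones stay symbolic.
from dataclasses import dataclass
from typing import Tuple, Union

Dim = Union[int, str]  # an integer, or a symbolic name such as "batch"


@dataclass
class TensorType:
    dtype: str
    shape: Tuple[Dim, ...]


# Shapes one might report for the TinyNet sketch above, with the batch
# dimension left symbolic instead of fixed to the traced value 8.
inferred = {
    "conv": TensorType("float32", ("batch", 16, 32, 32)),
    "fc":   TensorType("float32", ("batch", 10)),
}

for name, ty in inferred.items():
    print(f"{name}: {ty.dtype}{list(ty.shape)}")
# conv: float32['batch', 16, 32, 32]
# fc: float32['batch', 10]
```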

Tue 16 Jun

Displayed time zone: Pacific Time (US & Canada)

11:30 - 12:30
Compilers for Deep Learning Frameworks (MAPL) at MAPL live stream
Chair(s): Charles Sutton (Google Research)
11:30
30m
Talk
On the Challenges in Programming Mixed-Precision Deep Neural Networks
MAPL
Ruizhe Zhao (Imperial College London), Wayne Luk (Imperial College London), Chao Xiong (Corerain Technologies), Xinyu Niu (Corerain Technologies), Kuen Hung Tsoi (Corerain Technologies)
12:00
30m
Talk
Semi-static Type, Shape and Symbolic Shape Inference for Dynamic Computation Graphs
MAPL
Momoko Hattori (The University of Tokyo), Shimpei Sawada (Preferred Networks), Shinichiro Hamaji (Preferred Networks), Masahiro Sakai (Preferred Networks), Shunsuke Shimizu (Preferred Networks)