PLDI 2020
Mon 15 - Fri 19 June 2020
Tue 16 Jun 2020 13:00 - 14:00 at MAPL live stream - Keynote Talk. Chair(s): Justin Gottschlich

Training deep neural networks (DNNs) can be expensive and slow, consuming enormous numbers of compute-hours on parallel machines. This talk will present results on using novel search procedures over programs to reduce training time. In particular, instead of greedily applying program-improving transformations to compute a single improved program, we search a space of programs, considering many possible candidates guided by a global cost function. The talk will discuss applying search-based optimization to two separate problems: improving the partitioning and distribution of training data, and reducing the execution time of the DNN computation graph. Both methods speed up training by up to a factor of 3 over current state-of-the-art systems.
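
To make the core idea concrete, the sketch below shows one way a cost-guided search over a space of programs can differ from a greedy optimizer: rather than committing only to strictly improving rewrites, it keeps a frontier of many candidates ranked by a global cost function, so a rewrite that temporarily raises cost can still lead to a better final program. This is only an illustrative beam-search sketch, not the specific systems described in the talk; the `transforms` and `cost` functions and the program representation are all hypothetical placeholders supplied by the caller.

```python
import heapq

def search_optimize(program, transforms, cost, beam_width=16, max_steps=1000):
    """Illustrative cost-guided search over candidate programs.

    `transforms` is a list of functions, each mapping a program to a
    list of rewritten programs; `cost` is a global cost function
    (e.g., a predicted execution time). Both are placeholders here.
    """
    frontier = [(cost(program), 0, program)]  # min-heap keyed on cost
    best_cost, best = frontier[0][0], program
    seen = {repr(program)}   # dedup candidates by a printable fingerprint
    tiebreak = 1             # unique counter so the heap never compares programs
    for _ in range(max_steps):
        if not frontier:
            break
        c, _, prog = heapq.heappop(frontier)
        if c < best_cost:
            best_cost, best = c, prog
        for t in transforms:
            for cand in t(prog):  # each transform may propose several rewrites
                key = repr(cand)
                if key not in seen:
                    seen.add(key)
                    heapq.heappush(frontier, (cost(cand), tiebreak, cand))
                    tiebreak += 1
        # Prune the frontier to the cheapest beam_width candidates.
        frontier = heapq.nsmallest(beam_width, frontier)
        heapq.heapify(frontier)
    return best
```

A greedy optimizer corresponds to `beam_width=1` with only improving candidates kept; widening the beam and admitting cost-increasing rewrites is what lets the search escape local optima, at the price of evaluating the cost function on many more candidates.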

Alex Aiken is the Alcatel-Lucent Professor of Computer Science at Stanford. Alex received his Bachelor's degree in Computer Science and Music from Bowling Green State University in 1983 and his Ph.D. from Cornell University in 1988. Alex was a Research Staff Member at the IBM Almaden Research Center (1988-1993) and a Professor in the EECS department at UC Berkeley (1993-2003) before joining the Stanford faculty in 2003. His research interests are in areas related to programming languages. He is an ACM Fellow, a recipient of ACM SIGPLAN's Programming Languages Achievement Award and Phi Beta Kappa's Teaching Award, and a former chair of the Stanford Computer Science Department.

Tue 16 Jun

Displayed time zone: Pacific Time (US & Canada)

13:00 - 14:00  Keynote Talk (MAPL) at MAPL live stream
Chair(s): Justin Gottschlich (Intel Labs / Penn)

13:00  60m  Talk  Program Optimization for Machine Learning (MAPL)
Alex Aiken (Stanford University, USA)