CS Colloquium Speaker

Dougal Maclaurin from Google Brain

Tuesday, October 15 - 12:30pm

Computer Science - Room 105

Host: Ryan Adams

https://www.cs.princeton.edu/events/25877


JAX: Accelerated machine learning research via composable function transformations in Python


JAX is a system for high-performance machine learning research. It offers the familiarity of Python+NumPy and the speed of hardware accelerators, and it enables the definition and composition of function transformations useful for machine learning programs. In particular, these transformations include automatic differentiation, automatic batching, end-to-end compilation (via XLA), and parallelization across multiple accelerators. They are the key to JAX's power and to its relative simplicity.
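As a concrete illustration, here is a minimal sketch (not taken from the talk; the toy model and names are hypothetical) of how these transformations compose as ordinary higher-order functions:

import jax
import jax.numpy as jnp

def predict(params, x):
    # Hypothetical toy linear model: y = w * x + b
    w, b = params
    return w * x + b

def loss(params, x, y):
    # Squared error for a single example
    return (predict(params, x) - y) ** 2

grad_loss = jax.grad(loss)                                # automatic differentiation w.r.t. params
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0, 0))  # automatic batching over examples
fast_batched_grad = jax.jit(batched_grad)                 # end-to-end compilation via XLA

params = (2.0, 1.0)
xs = jnp.arange(4.0)
ys = 3.0 * xs + 1.0
print(fast_batched_grad(params, xs, ys))  # per-example gradients, in one compiled call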


JAX had its initial open-source release in December 2018 (https://github.com/google/jax). It is currently used by several groups of researchers for a wide range of advanced applications, from studying the spectra of neural networks to probabilistic programming, Monte Carlo methods, and scientific applications in physics and biology. Users appreciate JAX above all for its ease of use and flexibility.


Bio: Dougal Maclaurin is a research scientist at Google. He works on programming languages and systems for machine learning, particularly the Python library JAX. He started Autograd, a system for automatic differentiation in Python, which has inspired the design of several systems, including PyTorch, MinPy, Torch Autograd, and Julia Autograd. He is a co-founder of Day Zero Diagnostics, a startup developing a sequencing-based diagnostic for drug-resistant infections. He received his Ph.D. from Harvard in 2016, where he worked with Ryan Adams on methods for machine learning. His work on scalable MCMC, "Firefly Monte Carlo", was recognized with the Best Paper award at UAI 2014.