Colloquium- David Duvenaud, Thursday, Feb 18th, 12:30pm

Colloquium Speaker: David Duvenaud
Thursday, Feb 18th, 12:30pm
CS 105

Talk title: Composing differentiable procedures for modeling, inference, and optimization

Talk abstract: Much recent success in machine learning has come from optimizing simple feedforward procedures, such as neural networks, using gradients. Surprisingly, many complex procedures, such as message passing, filtering, inference, and even optimization itself, can be meaningfully differentiated through as well. Composing these procedures lets us build sophisticated models that generalize existing methods while retaining their good properties. We'll show applications to chemical design, gradient-based tuning of optimization procedures, and training procedures that don't require cross-validation.

Bio: David Duvenaud is a postdoc in the Harvard Intelligent Probabilistic Systems group, working with Prof. Ryan Adams on model-based optimization, synthetic chemistry, and neural networks. He did his Ph.D. at the University of Cambridge with Carl Rasmussen and Zoubin Ghahramani. Before that, he worked on machine vision with Kevin Murphy at the University of British Columbia and later at Google Research. David also co-founded Invenia, an energy forecasting and trading firm.

Mitra Kelly
Academic Secretary
Princeton University Computer Science Dept
35 Olden Street
Princeton NJ 08540
mkelly@cs.princeton.edu
609-258-4562
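To give a flavor of the abstract's claim that optimization itself can be differentiated through, here is a toy sketch (our illustration, not material from the talk): we unroll a few steps of gradient descent on f(x) = (x - 3)^2 and use hand-rolled forward-mode dual numbers to get the derivative of the final loss with respect to the learning rate, which is exactly the quantity needed for gradient-based tuning of an optimization procedure. The `Dual` class and all function names are hypothetical and for illustration only.

```python
# Sketch of "differentiating through optimization": unroll gradient
# descent and differentiate the final loss w.r.t. the learning rate.
# All names here are illustrative, not from the speaker's work.

class Dual:
    """Dual number a + b*eps for forward-mode differentiation."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def loss(x):
    # f(x) = (x - 3)^2, minimized at x = 3
    return (x - 3.0) * (x - 3.0)

def grad_loss(x):
    # d/dx (x - 3)^2 = 2(x - 3)
    return 2.0 * (x - 3.0)

def final_loss(lr, steps=10, x0=0.0):
    """Run `steps` of gradient descent; the whole loop is
    differentiable in the learning rate `lr`."""
    x = Dual(x0)
    for _ in range(steps):
        x = x - lr * grad_loss(x)
    return loss(x)

# Seed lr with derivative 1 to get d(final loss)/d(lr):
out = final_loss(Dual(0.1, 1.0))
print(out.val, out.dot)  # out.dot < 0: a larger step size would lower the final loss
```

The same unrolling idea, implemented with a real reverse-mode autodiff system rather than toy dual numbers, is what makes gradient-based tuning of optimizer hyperparameters possible at scale.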