Alex Upjohn Beatson will present his Pre-FPO "Learned surrogates and stochastic gradients for accelerating engineering modeling, simulation, and design" on Monday, February 8, 2021 at 11am via Zoom.

Zoom Link: https://princeton.zoom.us/j/99264611511

The members of his committee are as follows:
Readers - Szymon Rusinkiewicz, Sigrid Adriaenssens (Civil and Environmental Engineering)
Examiners - Ryan Adams (advisor), Elad Hazan, Mark Girolami (Cambridge University)

Title: Learned surrogates and stochastic gradients for accelerating engineering modeling, simulation, and design

Abstract:
Numerical methods, such as discretization-based methods for solving ODEs and PDEs, allow us to model and design complex devices, structures, and systems. However, doing so is often very costly in terms of both computation and the time of the expert who must specify the physical model, the discretization, the solver, etc. In this talk I will present a line of work that uses deep learning and stochastic gradient estimation to speed up the solution and optimization of systems characterized by such numerical methods.
First, I will present composable energy surrogates, in which neural surrogates are trained to model the potential energy of sub-components or sub-domains of a PDE and then composed, or stitched together, to solve a larger system by minimizing the sum of potentials across components. This allows surrogate modeling without costly ground-truth simulation of the full system, as training data are generated by performing finite element analysis on individual components. We show that these surrogates can accelerate the simulation of parametric meta-materials and produce accurate macroscopic behavior when composed.

Next, I will discuss randomized telescoping gradient estimators, which provide unbiased gradient estimates for objectives that are the limit of a sequence of increasingly accurate, increasingly costly approximations. These estimators represent the limit as a telescoping sum and sample linear combinations of its terms to provide cheap unbiased estimates. We discuss conditions under which these estimators have finite variance and finite expected computation, the optimality of certain estimators within this class, and applications to problems in numerical modeling and machine learning.

Lastly, I will discuss how these lines of work can be extended to reduce not only the computational burden but also the burden on the engineer carrying out modeling and design.
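A toy sketch of the composition idea in the first part, assuming only what the abstract states: each component exposes a scalar potential energy as a function of its interface displacements, and the assembled system is simulated by minimizing the sum of component potentials over the shared interface degrees of freedom. The quadratic surrogate_energy, the 1-D chain topology, and the use of scipy.optimize.minimize are illustrative stand-ins of my choosing, not the talk's actual models; a real surrogate would be a neural network trained on finite element solves of a single component.

import numpy as np
from scipy.optimize import minimize

def surrogate_energy(u_left, u_right, stiffness=1.0):
    # Stand-in for a learned surrogate: maps a component's two interface
    # displacements to a scalar potential energy. In the real method this
    # would be a neural network trained on finite element analyses of one
    # component; a quadratic keeps the example self-contained.
    stretch = u_right - u_left
    return 0.5 * stiffness * stretch**2

def total_energy(u_interior, n_components, load):
    # Compose a 1-D chain of components by summing their surrogate
    # potentials over the shared interface displacements, with the left
    # end clamped at 0 and the right end displaced by `load`.
    u = np.concatenate(([0.0], u_interior, [load]))
    return sum(surrogate_energy(u[i], u[i + 1]) for i in range(n_components))

n_components = 5
load = 1.0
u0 = np.zeros(n_components - 1)                # interior interface guesses
result = minimize(total_energy, u0, args=(n_components, load))
print(result.x)  # identical components -> evenly spread displacements

Note that only single-component energies are ever evaluated; the full system appears solely through the sum being minimized, which is what lets training data come from component-level simulation alone.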
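A minimal sketch of the randomized telescoping idea from the second part, under stated assumptions: the sequence J_n here is just the partial sums of the series for e (a stand-in for increasingly accurate, increasingly costly numerical approximations), the random truncation level is drawn from a geometric distribution (one simple member of the family; the talk also concerns other weightings and their optimality), and for clarity the sketch estimates the limiting value itself rather than a gradient; the same term-wise reweighting applies to gradient estimates.

import math
import numpy as np

rng = np.random.default_rng(0)

def J(n):
    # n-th partial sum of the series for e: a stand-in for an increasingly
    # accurate, increasingly costly approximation whose limit is the true
    # objective (here, e itself).
    return sum(1.0 / math.factorial(k) for k in range(n + 1))

def randomized_telescope(p=0.5, max_n=30):
    # Write lim J_n as J(0) + sum_n Delta_n with Delta_n = J(n) - J(n-1),
    # draw a random truncation level N, and reweight each computed term
    # by 1 / P(N >= n). The reweighting makes the truncated sum an
    # unbiased estimate of the limit despite finite computation. (Terms
    # beyond max_n are factorially small here, so dropping them changes
    # nothing at double precision.)
    N = min(rng.geometric(p), max_n)       # P(N = n) = (1 - p)**(n - 1) * p
    estimate = J(0)
    for n in range(1, N + 1):
        tail_prob = (1 - p) ** (n - 1)     # P(N >= n) for the geometric
        estimate += (J(n) - J(n - 1)) / tail_prob
    return estimate

samples = [randomized_telescope() for _ in range(20000)]
print(np.mean(samples), math.e)  # sample mean ~= e, up to Monte Carlo noise

Unbiasedness follows because each term Delta_n is included with probability P(N >= n) and divided by exactly that probability; the variance and expected compute are then governed by the choice of distribution over N, which is where the finite-variance conditions and optimality results mentioned in the abstract come in.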