Integrals with parametric, discontinuous integrands are ubiquitous, arising from the discrete latent structure inherent in applications such as shape optimization, physical simulation, and graphics. Treating these tasks as inverse problems with an associated generative process requires optimizing model parameters that interact through these discontinuities. The advent of powerful automatic differentiation libraries invites the use of generic gradient-based optimization to perform inference in these models, but doing so requires computing derivatives of integrals with parametric discontinuities. A systematic methodology for wielding conventional automatic differentiation frameworks in these discrete contexts is still lacking. In this work, we engage with the difficulty of computing these derivatives by noting a duality between the geometric/analytic properties of these functions and the probabilistic view of integration, namely Monte Carlo estimation. We introduce a differentiable variant of the simple Monte Carlo estimator that facilitates the general computation of these challenging derivatives using conventional automatic differentiation frameworks. We justify the estimator analytically and demonstrate the generality of the method as applied to differentiable rendering, implicit shape optimization, PDE solving, and volumetric reconstruction.
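To make the core difficulty concrete, below is a minimal sketch in JAX (the toy integrand, sample count, and function names are illustrative choices of ours, not the estimator presented in the talk): a plain Monte Carlo estimate of the integral of 1[x < theta] over [0, 1], whose true value is theta and whose true derivative is 1, yields an accurate value, yet differentiating the estimator with standard automatic differentiation returns zero, because theta enters only through the discontinuous indicator.

import jax
import jax.numpy as jnp

# Toy target: I(theta) = integral_0^1 1[x < theta] dx = theta, so dI/dtheta = 1.
def naive_mc_estimate(theta, key, n_samples=10_000):
    # Plain Monte Carlo: average the discontinuous integrand at uniform samples.
    x = jax.random.uniform(key, (n_samples,))
    return jnp.mean(jnp.where(x < theta, 1.0, 0.0))

key = jax.random.PRNGKey(0)
print(naive_mc_estimate(0.3, key))            # close to 0.3: the value is fine
print(jax.grad(naive_mc_estimate)(0.3, key))  # 0.0: AD misses the true derivative of 1

The zero gradient is not a bug in the AD framework: the indicator is piecewise constant in theta at almost every sample, so the pointwise derivative ignores the boundary contribution that the abstract's estimator is designed to capture.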
Reading List:
https://drive.google.com/drive/u/1/folders/1VT6kaFKBOnprydP0a_ACaH45IFKcz-ad
Everyone is invited to attend the talk, and faculty wishing to remain for the oral exam that follows are welcome to do so.