=== 2/6/24 ORFE Distinguished Lecture Series===
DATE: Tuesday, February 6, 2024
TIME: 4:30pm
LOCATION: Sherrerd 101
SPEAKER: Yao Xie, Georgia Tech
TITLE: Generative models for statistical inference
Abstract:
We consider the problem of learning a continuous probability density function from data, a fundamental problem in statistics known as density estimation. It also arises in distributionally robust optimization (DRO), where the goal is to find the worst-case distribution representing scenario departures from the observations. The problem is known to be hard in high dimensions and poses a significant computational challenge. In this talk, I will present a new approach to tackling these challenges, leveraging recent advances in neural-network-based generative models, a class of machine learning models that aim to generate new samples following the underlying distribution of the observations. We develop a new flow-based generative model with guarantees, inspired by the celebrated Jordan-Kinderlehrer-Otto (JKO) scheme, which represents the data distribution as a composition of a sequence of optimal transport maps applied to a source distribution (such as a Gaussian). Our method greatly reduces computational cost while achieving performance competitive with existing generative models. The connection between our JKO-flow method and proximal gradient descent in the Wasserstein-2 space enables us to prove a density-learning guarantee with an exponential convergence rate. Beyond density estimation, we also demonstrate that the JKO-flow generative model can be used in various applications, including adversarial learning, robust hypothesis testing, and data-driven differential privacy.
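The core idea of the abstract, representing a target distribution as a composition of transport maps, each obtained from a Wasserstein proximal (JKO) step applied to a simple source such as a Gaussian, can be illustrated with a minimal 1-D sketch. This is not the speaker's neural-network method: in one dimension the JKO step for F(mu) = W2^2(mu, target) has a closed form on quantiles, which we use purely for intuition. All names below (`w2_sq`, `h`, `n_blocks`) are our own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D problem: transport a standard Gaussian source toward N(3, 0.5^2).
target = np.sort(rng.normal(3.0, 0.5, 2000))   # sorted target samples
x = rng.normal(0.0, 1.0, 2000)                 # particles from the source

def w2_sq(samples, sorted_ref):
    # Squared Wasserstein-2 distance between two 1-D empirical measures
    # is the mean squared difference of their sorted samples.
    return np.mean((np.sort(samples) - sorted_ref) ** 2)

w2_init = w2_sq(x, target)

h = 0.5        # JKO step size (proximal parameter)
n_blocks = 8   # number of composed transport maps

for _ in range(n_blocks):
    # One JKO step for F(mu) = W2^2(mu, target): minimize, over the next
    # measure mu', the objective F(mu') + W2^2(mu', mu) / (2h).  In 1-D
    # this decouples across quantiles, giving the closed-form update
    #   q' = (2h * q_target + q_mu) / (2h + 1),
    # and each block's transport map is the induced monotone rearrangement.
    order = np.argsort(x)
    x[order] = (2 * h * target + x[order]) / (2 * h + 1)

w2_final = w2_sq(x, target)
```

With h = 0.5 each block moves every quantile halfway to the target, so the W2 gap shrinks geometrically in the number of composed blocks, mirroring the exponential convergence rate the abstract claims for proximal gradient descent in Wasserstein-2 space. In the talk's setting each block is instead a learned neural-network map and distributions live in high dimensions, where no such closed form exists.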
Bio: Yao Xie is the Coca-Cola Foundation Chair and Professor at the Georgia Institute of Technology in the H. Milton Stewart School of Industrial and Systems Engineering, and Associate Director of the Machine Learning Center. She received her Ph.D. in Electrical Engineering (minor in Mathematics) from Stanford University in 2012 and was a Research Scientist at Duke University. Her research lies at the intersection of statistics, machine learning, and optimization, providing theoretical guarantees and developing computationally efficient and statistically powerful methods for problems motivated by real-world applications. She received the National Science Foundation (NSF) CAREER Award in 2017, was a finalist for the INFORMS Wagner Prize in 2021, and received the INFORMS Gaver Early Career Award for Excellence in Operations Research in 2022. She is currently an Associate Editor for IEEE Transactions on Information Theory, Journal of the American Statistical Association (Theory and Methods), Operations Research, Sequential Analysis: Design Methods and Applications, and the INFORMS Journal on Data Science, and an Area Chair of NeurIPS, ICML, and ICLR.