[talks] Colloquium Speaker Erik Sudderth, Wed Nov 24, 4:30pm

Nicole E. Wagenblast nwagenbl at CS.Princeton.EDU
Fri Nov 14 11:00:00 EST 2014

Colloquium Speaker 
Erik Sudderth, Brown University 
Wednesday, November 24, 2014 - 4:30pm 
Computer Science 105 

Flexible, Reliable, and Scalable Nonparametric Learning 

Applications of statistical machine learning increasingly involve datasets with rich hierarchical, temporal, spatial, or relational structure. Bayesian nonparametric models offer the promise of effective learning from big datasets, but standard inference algorithms often fail in subtle and hard-to-diagnose ways. We explore this issue via variants of a popular and general model family, the hierarchical Dirichlet process. We propose a framework for "memoized" online optimization of variational learning objectives, which achieves computational scalability by processing local batches of data, while simultaneously adapting the global model structure in a coherent fashion. Using this approach, we build improved models of text, audio, image, and social network data. 
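The "memoized" batch-caching idea mentioned in the abstract can be illustrated with a toy sketch. This is not the speaker's actual algorithm (which optimizes variational objectives for hierarchical Dirichlet process models); it is a minimal, assumed illustration using plain EM on a 1-D Gaussian mixture, where each batch's cached sufficient statistics are swapped out and replaced on every visit so that global updates always reflect full-dataset statistics:

```python
import numpy as np

def memoized_em(data, K=2, n_passes=10, batch_size=50, seed=0):
    """Toy memoized EM for a 1-D Gaussian mixture with fixed unit variance.

    Illustrative only: global sufficient statistics are maintained as the
    sum of cached per-batch statistics. Visiting a batch replaces (rather
    than re-adds) its cached contribution, so each global parameter update
    uses statistics for the whole dataset, not just the current batch.
    """
    rng = np.random.default_rng(seed)
    batches = [data[i:i + batch_size] for i in range(0, len(data), batch_size)]
    B = len(batches)
    mu = rng.normal(size=K)               # global component means (hypothetical init)
    cache_cnt = np.zeros((B, K))          # cached per-batch responsibility counts
    cache_sum = np.zeros((B, K))          # cached per-batch weighted data sums
    tot_cnt = np.zeros(K)                 # full-dataset statistics (sums of caches)
    tot_sum = np.zeros(K)
    for _ in range(n_passes):
        for b, x in enumerate(batches):
            # Local step: responsibilities under the current global means.
            logp = -0.5 * (x[:, None] - mu[None, :]) ** 2
            r = np.exp(logp - logp.max(axis=1, keepdims=True))
            r /= r.sum(axis=1, keepdims=True)
            # Memoized update: swap this batch's cached statistics for new ones.
            new_cnt, new_sum = r.sum(axis=0), r.T @ x
            tot_cnt += new_cnt - cache_cnt[b]
            tot_sum += new_sum - cache_sum[b]
            cache_cnt[b], cache_sum[b] = new_cnt, new_sum
            # Global step: update means from full-dataset statistics.
            mu = tot_sum / np.maximum(tot_cnt, 1e-9)
    return np.sort(mu)
```

The memory cost is one statistics vector per batch, in exchange for global updates that never discount or re-weight stale batches the way simpler stochastic schemes must.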

Erik B. Sudderth is an Assistant Professor in the Brown University Department of Computer Science. He received the Bachelor's degree (summa cum laude, 1999) in Electrical Engineering from the University of California, San Diego, and the Master's and Ph.D. degrees (2006) in EECS from the Massachusetts Institute of Technology. His research interests include probabilistic graphical models, nonparametric Bayesian methods, and applications of statistical machine learning in computer vision and the sciences. He received an NSF CAREER award in 2014, and in 2008 was named one of "AI's 10 to Watch" by IEEE Intelligent Systems Magazine.

