[talks] FW: Max Welling, May 11, 12:30pm (lunch at 12pm)

Ginny Hogan gch at CS.Princeton.EDU
Mon May 11 10:54:42 EDT 2009

Max Welling from UCI will give a talk on May 11 at 12:30 (lunch served at
12:00) in CS 302. He will also host a discussion group at 4:30pm. See end of
this email for instructions on how to schedule a meeting with Max.

Title: On Herding Dynamical Weights and Fractal Attractors

Abstract: Learning the parameters of a Markov Random Field is intractable.
To circumvent part of this intractability, I propose to give up on the idea
of trying to obtain point estimates. Inspired by the concept of "dynamical
synapses", a dynamical system is introduced that generates sequences of
pseudo-samples that are guaranteed to satisfy the moment constraints of the
associated maximum likelihood problem. This dynamical system is
deterministic, yet non-periodic with Lyapunov exponents all equal to zero,
and its attractor set has fractal properties. I will discuss how to leverage
these ideas for classification and estimation and show experimental results
for fully observed and restricted Boltzmann machines.
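The herding dynamics sketched in the abstract can be illustrated with a toy example. The sketch below is illustrative only (the state space, feature map, and target moments are invented, not taken from the talk): each step deterministically picks the state that maximizes the current weights' score, then updates the weights by the gap between the target moments and that state's features, so the running average of the pseudo-samples' features tracks the moment constraints.

```python
import numpy as np

# Hedged sketch of herding dynamics (all specifics are illustrative):
#   s_t     = argmax_s <w_t, phi(s)>        (deterministic pseudo-sample)
#   w_{t+1} = w_t + mu - phi(s_t)           (weight update)
# where mu is the target moment vector. No point estimate of w is ever
# sought; the sequence s_1, s_2, ... is the output.

rng = np.random.default_rng(0)

# Toy state space: the 8 binary vectors of length 3; the feature map is
# the identity, so the moment constraints are per-bit marginals.
states = np.array([[int(b) for b in f"{i:03b}"] for i in range(8)], dtype=float)

def phi(s):
    return s  # identity feature map for this toy example

mu = np.array([0.7, 0.2, 0.5])  # target moments (assumed given)

w = rng.normal(size=3)          # arbitrary initial weights
avg = np.zeros(3)
T = 10_000
for t in range(1, T + 1):
    s = states[np.argmax(states @ w)]   # deterministic maximization step
    w = w + mu - phi(s)                 # weights absorb the moment gap
    avg += (phi(s) - avg) / t           # running feature average

print(avg)  # close to mu: pseudo-samples satisfy the moment constraints
```

Note the dynamics involve no randomness after initialization, matching the abstract's point that the system is deterministic yet generates sample-like sequences.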

Title: On the Role of Smoothing in Topic Models (with Arthur Asuncion and
Padhraic Smyth) Discussion Group @ 4:30PM, Location TBD

Abstract: Latent Dirichlet allocation, or topic modeling, is a flexible latent
variable framework for modeling high-dimensional, sparse count data. Various
learning algorithms have been developed in recent years, including collapsed
Gibbs sampling, variational inference, and maximum a posteriori estimation,
and this variety motivates the need for careful empirical comparisons
between these approaches.  We first highlight their close connections, and
find that the main differences are attributable to the
amount of smoothing applied to the counts.  When the hyperparameters are
optimized, the differences in performance among the algorithms diminish
significantly.  The ability of these algorithms to achieve similarly
accurate solutions gives us the freedom to select computationally efficient
approaches.  On text corpora with thousands of documents, accurate topic
models can be learned in several seconds, using the insights gained from
this comparative study.
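As a concrete illustration of where the smoothing hyperparameters enter, here is a minimal collapsed Gibbs sampler for a topic model on a toy corpus. This is not the authors' code; the corpus, topic count, and hyperparameter values are invented. The Dirichlet hyperparameters alpha and beta appear as pseudo-counts added to the document-topic and topic-word counts, which is the smoothing the abstract refers to.

```python
import numpy as np

# Hedged sketch (illustrative only): collapsed Gibbs sampling for LDA.
# Each token's topic is resampled from the conditional
#   p(z = k | rest) ∝ (n_dk + alpha) * (n_kw + beta) / (n_k + V * beta)
# where alpha and beta smooth the raw counts.

rng = np.random.default_rng(0)

# Toy corpus: word ids per document (invented for this example)
docs = [[0, 0, 1, 1, 2], [0, 1, 1, 2, 2], [3, 3, 4, 4, 5], [3, 4, 4, 5, 5]]
V, K = 6, 2                 # vocabulary size, number of topics
alpha, beta = 0.1, 0.01     # Dirichlet smoothing hyperparameters

D = len(docs)
n_dk = np.zeros((D, K))     # topic counts per document
n_kw = np.zeros((K, V))     # word counts per topic
n_k = np.zeros(K)           # total tokens per topic
z = [[0] * len(d) for d in docs]

# Random initialization of topic assignments
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = int(rng.integers(K))
        z[d][i] = k
        n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

for _ in range(200):        # Gibbs sweeps
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]     # remove the current assignment from the counts
            n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            k = int(rng.choice(K, p=p / p.sum()))
            z[d][i] = k     # add the new assignment back
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

# Smoothed topic-word estimates: beta pseudo-counts keep unseen words
# from getting zero probability.
phi = (n_kw + beta) / (n_k[:, None] + V * beta)
print(phi)
```

Setting alpha and beta to different values (or optimizing them) changes how aggressively the counts are smoothed, which is the knob the comparative study identifies as the main source of performance differences between learning algorithms.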

If you'd like to meet Max on Monday, please:
1. Visit http://wass.princeton.edu.
2. Sign in using your OIT login.
3. Click on "Make an appointment."
4. Enter "mlvisit" where it says "Calendar owner's netid."
5. Reserve a spot on May 11 under "Meeting with Max Welling."