[Ml-stat-talks] ben taskar (upenn) speaks tomorrow at noon

David Blei blei at CS.Princeton.EDU
Mon Mar 21 09:46:47 EDT 2011


hi ml-stat-talks,

a reminder that tomorrow at noon in cs302 is ben taskar from upenn.  see below.

ben is a top machine learning researcher who has made important
contributions in many areas (and he's practically local).  don't miss
it!

best
dave

---------- Forwarded message ----------
From: David Mimno <mimno at cs.princeton.edu>
Date: Fri, Mar 18, 2011 at 9:13 AM
Subject: [Ml-stat-talks] Ben Taskar on determinantal point processes
To: ml-stat-talks <ml-stat-talks at lists.cs.princeton.edu>



When you look at truly random points, like the stars or flecks of dust
on a monitor, it's not unusual to see points almost touching. But if
you ask people to draw "random points", they'll probably have a slight
bias against putting two dots right next to each other. What
statistical distributions describe this kind of not-quite-randomness?
For our next Machine Learning lecture, Ben Taskar from Penn will
describe just such a model, and its surprisingly wide applications.
[-DM]
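
To make the repulsion concrete, here is a minimal sketch (mine, not
from the talk abstract): in the L-ensemble form of a DPP, the
probability of choosing a subset S is proportional to det(L_S), the
determinant of a similarity kernel L restricted to the rows and
columns in S. When two items are nearly identical, that determinant is
close to zero, so they rarely show up together. A toy example in
numpy, with a made-up three-item kernel:

  import numpy as np

  # Toy L-ensemble kernel over three items: the diagonal encodes item
  # "quality", the off-diagonals encode similarity. Items 0 and 1 are
  # nearly identical; item 2 is different from both.
  L = np.array([[1.0, 0.9, 0.1],
                [0.9, 1.0, 0.1],
                [0.1, 0.1, 1.0]])

  def unnormalized_prob(subset):
      """P(Y = subset) is proportional to det(L restricted to subset)."""
      return np.linalg.det(L[np.ix_(subset, subset)])

  print(unnormalized_prob([0, 1]))  # ~0.19: similar items repel each other
  print(unnormalized_prob([0, 2]))  # ~0.99: dissimilar items co-occur easily

(Dividing by det(L + I) turns these into actual probabilities.)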

Tues, Mar 22, 12:00. CS 302. Lunch!

======================================

Ben Taskar, University of Pennsylvania

Determinantal Point Processes: Representation, Inference and Learning

Determinantal point processes (DPPs) arise in random matrix theory
and quantum physics as models of random variables with negative
correlations. Among many remarkable properties, they offer tractable
algorithms for exact inference, including computing marginals,
computing certain conditional probabilities, and sampling. DPPs are
a natural model for subset selection problems where diversity is
preferred. For example, they can be used to select diverse sets of
sentences to form document summaries, to return relevant but varied
text and image search results, or to detect multiple non-overlapping
object trajectories in video. I'll present our recent work on a novel
factorization and dual representation of DPPs that enables efficient
inference over exponentially large structured sets. We develop a new
inference algorithm based on Newton's identities for DPPs conditioned
on subset size. We also derive efficient parameter estimation for DPPs
from several types of observations. I'll show the advantages of the
model on several natural language and vision tasks: extractive
document summarization, diversifying image search results, and
multi-person articulated pose estimation in images.

Joint work with Alex Kulesza, University of Pennsylvania
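
A rough note on the Newton-identities step above, sketched under the
assumption (not stated in the abstract) that one already has the
eigenvalues of the DPP kernel L: for a DPP conditioned on subset size
|Y| = k, the normalizing constant is the k-th elementary symmetric
polynomial of those eigenvalues, and Newton's identities recover
exactly those polynomials from the easy-to-compute power sums. This is
only an illustration, not the authors' code:

  import numpy as np

  def elementary_symmetric(eigs, k_max):
      """Return e_0 .. e_{k_max} of the eigenvalues via Newton's identities.

      For a DPP conditioned on |Y| = k, e_k(eigenvalues of L) is the
      normalizing constant.
      """
      eigs = np.asarray(eigs, dtype=float)
      # Power sums p_i = sum_j eigs_j ** i, for i = 1 .. k_max
      p = [np.sum(eigs ** i) for i in range(1, k_max + 1)]
      e = [1.0]                      # e_0 = 1
      for k in range(1, k_max + 1):
          # Newton's identity: k * e_k = sum_{i=1..k} (-1)^(i-1) * e_{k-i} * p_i
          e_k = sum((-1) ** (i - 1) * e[k - i] * p[i - 1]
                    for i in range(1, k + 1)) / k
          e.append(float(e_k))
      return e

  # Example with eigenvalues of a hypothetical kernel L
  lam = np.array([2.0, 1.0, 0.5])
  print(elementary_symmetric(lam, 2))  # [1.0, 3.5, 3.5]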

Bio:
Ben Taskar received his bachelor's and doctoral degrees in Computer
Science from Stanford University. After a postdoc at the University of
California at Berkeley, he joined the faculty at the University of
Pennsylvania Computer and Information Science Department in 2007,
where he currently co-directs PRiML: Penn Research in Machine
Learning. His research interests include machine learning, natural
language processing and computer vision. He has been awarded the
Sloan Research Fellowship and selected for the Young Investigator
Program by the Office of Naval Research and the DARPA Computer
Science Study Group. His work on structured prediction has received
best paper awards at NIPS and EMNLP conferences.
_______________________________________________
Ml-stat-talks mailing list
Ml-stat-talks at lists.cs.princeton.edu
https://lists.cs.princeton.edu/mailman/listinfo/ml-stat-talks

