[Ml-stat-talks] Colloquium speaker: Richard Socher Tues, March 11, 4:30pm

David Blei blei at CS.Princeton.EDU
Fri Mar 7 14:25:53 EST 2014

"you got your NLP in my deep learning!"
"you got your deep learning in my NLP!"

whatever you might think of peanut butter cups, richard socher's
results are making waves.  next tuesday.


---------- Forwarded message ----------
From: Nicole E. Wagenblast <nwagenbl at cs.princeton.edu>
Date: Fri, Mar 7, 2014 at 1:34 PM
Subject: [talks] Colloquium speaker: Richard Socher Tues, March 11, 4:30pm
To: "Talks (colloquium)" <talks at lists.cs.princeton.edu>

Recursive Deep Learning for Modeling Compositional Meaning in Language

Richard Socher (Stanford University)

Tuesday, March 11, 4:30pm

Computer Science 105

Great progress has been made in natural language processing thanks to
many different algorithms, each often specific to one application.
Most learning algorithms force language into simplified
representations such as bag-of-words or fixed-size windows, or require
human-designed features. I will introduce three models based on
recursive neural networks that can learn linguistically plausible
representations of language. These methods jointly learn compositional
features and grammatical sentence structure for parsing or phrase
level sentiment predictions. They can also be used to represent the
visual meaning of a sentence, which makes it possible to retrieve
images from query sentences or to describe images in richer terms
than single object names.
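
The core idea behind these recursive models is that a parent phrase vector is computed from its two child vectors with a shared composition function. A minimal sketch (all names, dimensions, and initializations here are illustrative assumptions, not Socher's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # illustrative embedding dimension

# Shared composition parameters, applied at every node of the parse tree
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

def compose(left, right):
    """Parent vector from two children: p = tanh(W [left; right] + b)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Compose "eating" with "spaghetti" into a phrase vector of the same size,
# so the result can itself be composed further up the tree
eating = rng.standard_normal(d)
spaghetti = rng.standard_normal(d)
phrase = compose(eating, spaghetti)
print(phrase.shape)  # (4,)
```

Because the parent has the same dimensionality as its children, the same function can be applied recursively up an entire parse tree, which is what lets the model score alternative attachments such as the two "with" phrases above.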

Besides the state-of-the-art performance, the models capture
interesting phenomena in language such as compositionality. For
instance, people easily see that the "with" phrase in "eating
spaghetti with a spoon" specifies a way of eating whereas in "eating
spaghetti with some pesto" it specifies the dish. I show that my model
solves these prepositional attachment problems well thanks to its
distributed representations. In sentiment analysis, a new tensor-based
recursive model learns different types of high-level negation and how
they can change the meaning of longer phrases with many positive
words. The model also learns that when contrastive conjunctions such
as "but" are used, the sentiment of the phrase following them usually
dominates.

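The tensor-based composition mentioned above adds a bilinear interaction between the two children on top of the plain linear layer. A rough sketch in the spirit of a recursive neural tensor model (shapes, scaling, and variable names are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4  # illustrative dimension

# One bilinear "slice" per output unit, plus the standard linear layer
V = rng.standard_normal((d, 2 * d, 2 * d)) * 0.01
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

def compose_tensor(left, right):
    """p_k = tanh(c^T V[k] c + (W c + b)_k) where c = [left; right]."""
    c = np.concatenate([left, right])
    bilinear = np.einsum('i,kij,j->k', c, V, c)  # c^T V[k] c for each k
    return np.tanh(bilinear + W @ c + b)

not_vec = rng.standard_normal(d)
good = rng.standard_normal(d)
print(compose_tensor(not_vec, good).shape)  # (4,)
```

The tensor term lets the children interact multiplicatively, which is what gives the model room to capture effects like negation flipping the sentiment of a phrase rather than merely dampening it.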
Richard Socher is a PhD student at Stanford working with Chris Manning
and Andrew Ng. His research interests are machine learning for NLP and
vision. He is interested in developing new deep learning models that
learn useful features, capture compositional structure in multiple
modalities and perform well across different tasks. He was awarded the
2011 Yahoo! Key Scientific Challenges Award, the Distinguished
Application Paper Award at ICML 2011, a Microsoft Research PhD
Fellowship in 2012 and a 2013 "Magic Grant" from the Brown Institute
for Media Innovation.

talks mailing list
talks at lists.cs.princeton.edu