[talks] reminder: Tomorrow's CS Colloquium (Feb 11): Stuart Geman

Fei-Fei Li feifeili at CS.Princeton.EDU
Tue Feb 10 17:03:13 EST 2009

-------- Original Message --------
Subject: [Ml-stat-talks] CS Colloquium (Feb 11): Stuart Geman
Date: Sun, 01 Feb 2009 17:08:43 -0500
From: Fei-Fei Li <feifeili at CS.Princeton.EDU>
CC: csugrad at CS.Princeton.EDU, ml-stat-talks at lists.cs.princeton.edu, 
    csgrad at CS.Princeton.EDU, csfac at CS.Princeton.EDU, 
pixl-talks at lists.cs.princeton.edu
References: <8697FFE6-7521-4BB6-A8E8-7C1A55376400 at cs.princeton.edu>

Dear all,

Stuart Geman, James Manning Professor of Applied Mathematics from Brown
University, will give a CS Colloquium talk at 4:15pm Wednesday Feb 11.
Prof. Geman has been a leader in statistics and computer vision. His
recent projects span computer vision, neuroscience, and financial
markets. Please find his talk title and abstract below.


Google and the Vapnik-Chervonenkis Dimension

Stuart Geman
Brown University

Google engineers routinely train query classifiers, for ranking
advertisements or search results, on more words than any human being
sees or hears in a lifetime. A human being who sees a meaningfully new
image every second for one-hundred years will not see as many images as
Google has in its libraries, all of which are available for training
object detectors and image classifiers. Yet by human standards the
state-of-the-art, in computer understanding of language and
computer-generated image analysis, is primitive. What explains the gap?
Why can’t learning theory tell us how to make machines that learn as
efficiently as humans? Upper bounds on the number of training samples
needed to learn a classifier as rich and competent as the human visual
system can be derived using the Vapnik-Chervonenkis dimension, or the
metric entropy, but these suggest that not only does Google need more
examples, but all of evolution might fall short. I will make some
proposals for efficient learning and offer some mathematics to support
them.
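The sample-complexity bounds alluded to above can be sketched numerically. Below is a minimal illustration of one common form of the PAC/VC upper bound (constants differ across textbooks, and the VC-dimension figure used here is purely a hypothetical stand-in, not a number from the talk):

```python
import math

def vc_sample_bound(d, eps, delta):
    """One common upper bound on the number of i.i.d. samples
    sufficient to PAC-learn a hypothesis class of VC dimension d
    to error eps with probability 1 - delta.
    (Constants vary by source; this is an illustrative form.)"""
    return (1.0 / eps) * (d * math.log(2.0 / eps) + math.log(2.0 / delta))

# Hypothetical numbers for illustration: a classifier family with
# VC dimension 10**6, learned to 1% error at 99% confidence.
m = vc_sample_bound(10**6, 0.01, 0.01)
print(f"samples needed: {m:.2e}")
```

Even with this modest (and surely too small, for anything like human vision) choice of VC dimension, the bound lands in the hundreds of millions of samples, which gives a sense of why such bounds suggest that even Google-scale data may not suffice.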