Talk by John Langford next Monday, March 2 -- 1st speaker of the Machine Learning Speaker Series
This semester, the machine learning group will be organizing a speaker series, with the goal of inviting prominent machine learning researchers to give talks at our department. The series is being sponsored by Yahoo!.

Our first speaker is John Langford from Yahoo! Research, who will be visiting the department next Monday (March 2). John is an expert in the theory of machine learning, and he writes a popular blog about the field, http://hunch.net. He will give his talk at 4:25pm in CS 105 (the small auditorium); the title and abstract appear below.

Future talk announcements for the speaker series will be sent to ml-stat-talks, a mailing list for announcing talks related to machine learning and statistics at Princeton. You can join the list by following this link: https://lists.cs.princeton.edu/mailman/listinfo/ml-stat-talks

If you'd like to meet John on Monday, details about his schedule will be sent to ml-stat-talks soon.

Title: Use of Hash in Machine Learning

Abstract: I'll describe a learning algorithm (Vowpal Wabbit) which effectively learns on and uses terascale datasets with a collection of tricks. One of the central tricks is rather fascinating: it uses a core hash representation which is extremely efficient in both time and space. Stated another way, we use a sparsity-preserving random projection, for which we prove convergence bounds reminiscent of the JL Lemma or locality-sensitive hashing. This trick enables otherwise impossible applications, like building a personalized spam filter for hundreds of thousands of users that easily fits into the RAM of a single machine.
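For readers unfamiliar with the hashing idea the abstract mentions, here is a minimal sketch of feature hashing (the "hashing trick"): string features are mapped into a fixed-size vector by a hash function, so memory stays bounded no matter how large the vocabulary grows, and sparsity is preserved. The hash function, dimension, and sign scheme below are illustrative choices, not the specific ones used in Vowpal Wabbit.

```python
import hashlib

def hashed_features(tokens, dim=2**20):
    """Map a list of string features to a sparse {index: weight} dict.

    Each token is hashed to an index in [0, dim); a second bit of the
    hash supplies a +/-1 sign, which keeps inner products between
    hashed vectors unbiased in expectation.
    """
    vec = {}
    for tok in tokens:
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        index = h % dim                 # bucket the feature lands in
        sign = 1 if (h // dim) % 2 == 0 else -1
        vec[index] = vec.get(index, 0) + sign  # collisions simply add
    return vec

# Example: a tiny "email" represented as a bag of hashed words.
doc = "free offer free prize".split()
print(hashed_features(doc))  # repeated words accumulate weight in one bucket
```

Because the representation is just a dictionary (or a fixed-length array) of hashed indices, a per-user spam filter reduces to hashing (user, word) pairs into the same shared weight vector, which is how such a model can serve many users from one machine's RAM.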
participants (1)
- Melissa Lawson