[Ml-stat-talks] CSML - David Dunson (Nov 4) and Rob Tibshirani (Nov 12)

Storey, John D. jstorey at Princeton.EDU
Wed Oct 15 11:41:01 EDT 2014


Dear all,

I’m very excited to let you all know about the first two seminars in the inaugural seminar series of the Center for Statistics and Machine Learning (which was just launched last July)!  Our first two speakers are David Dunson and Rob Tibshirani, both COPSS Presidents’ Award winners and extremely influential researchers.  As you can see from their talk titles below, these should both be fascinating seminars!

Details are below…

Best wishes,
John Storey

David Dunson, Duke University
CS 105 on November 4 at 4:30PM
Title: Scalable Bayes
Abstract:  Bayesian methods hold great promise for big data sets, but this promise has not been fully realized due to the lack of scalable and robust algorithms.  Usual sampling approaches bog down as the size of the data and the number of parameters increase.  For massive data sets, it has become routine to rely on penalized optimization approaches implemented on distributed computing systems.  However, in scientific applications, it is crucial to obtain a good characterization of uncertainty and not simply a point estimate.  We propose several fundamentally new approaches for scaling up, which we illustrate with applications to large survey data sets, neuroscience and genomics.
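
(For readers less familiar with the scaling problem, below is a minimal sketch of one widely used strategy, embarrassingly parallel "consensus"-style sampling: split the data into shards, sample each subposterior independently, and recombine the draws by precision-weighted averaging. This is only an illustration of the general idea, not necessarily any of the approaches in the talk; the toy model and all settings below are assumptions made for the example.)

# Minimal sketch (NOT the methods from the talk): consensus-style combination
# of subposterior samples for a toy normal-mean model.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y_i ~ N(theta, 1), with a vague N(0, 100) prior on theta.
theta_true = 2.0
y = rng.normal(theta_true, 1.0, size=10_000)

def subposterior_logpost(theta, shard, n_shards, prior_var=100.0):
    """Log of p(theta)^(1/S) * prod_{i in shard} p(y_i | theta), up to a constant."""
    log_prior = -0.5 * theta**2 / prior_var
    log_lik = -0.5 * np.sum((shard - theta) ** 2)
    return log_prior / n_shards + log_lik

def rw_metropolis(logpost, n_iter=5_000, step=0.05, init=0.0):
    """Plain random-walk Metropolis for a scalar parameter; returns post-burn-in draws."""
    theta, draws = init, np.empty(n_iter)
    lp = logpost(theta)
    for t in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[t] = theta
    return draws[n_iter // 2:]

S = 10
shards = np.array_split(y, S)
sub_draws = [rw_metropolis(lambda th, s=s: subposterior_logpost(th, s, S))
             for s in shards]

# Consensus combination: precision-weighted average of the shard-level draws.
precisions = np.array([1.0 / np.var(d) for d in sub_draws])
combined = np.sum([w * d for w, d in zip(precisions, sub_draws)], axis=0) / precisions.sum()
print("combined posterior mean ~", combined.mean(), " sd ~", combined.std())

(Precision weighting is exact when the subposteriors are Gaussian, as they essentially are in this toy example; the appeal of this family of schemes is that the expensive sampling runs never need to communicate.)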

Rob Tibshirani, Stanford University
Lewis Library 120 on November 12 at 4:00PM
Title: Post-selection Inference for Forward Stepwise and Least Angle Regression
Abstract:  In this talk I propose new inference tools for least angle and forward stepwise regression. I first present a general scheme for valid inference after any selection event that can be described as the observation vector y falling into a polyhedron.  Following this, I derive a new procedure called the "spacing test," which provides exact conditional tests at any step of the least angle regression (LAR) algorithm, as well as "selection intervals" for the appropriate underlying regression parameters. Remarkably, these tests and intervals account correctly for the adaptive selection performed by LAR. I will apply the same framework to yield selection-adjusted tests and intervals for forward stepwise regression in linear models, generalized linear models, and other settings such as the Cox model. Finally, I will briefly discuss current work extending these ideas to the PCA setting. Joint work with Jonathan Taylor, Richard Lockhart and Ryan J. Tibshirani.
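
(For those who want a rough picture of the polyhedral device the abstract refers to, here is a sketch; the notation is mine and summarizes the related published work rather than the talk itself. Suppose y ~ N(mu, sigma^2 I) and the selection event of interest, for example "LAR selects a particular model with particular signs in its first k steps," can be written as a polyhedron:

\[
\{\,\hat{M}(y) = M\,\} \;=\; \{\, y : A y \le b \,\}.
\]

For a contrast vector $\eta$ chosen after selection, conditioning on this event and on the component of $y$ orthogonal to $\eta$ leaves $\eta^\top y$ Gaussian, truncated to an interval whose endpoints depend only on that orthogonal component $z = P_\eta^\perp y$:

\[
\eta^\top y \,\big|\, \{A y \le b,\ P_\eta^\perp y = z\}
\;\sim\; N\big(\eta^\top \mu,\ \sigma^2 \lVert\eta\rVert_2^2\big)
\ \text{truncated to}\ [\mathcal{V}^-(z),\ \mathcal{V}^+(z)].
\]

Evaluating the corresponding truncated-Gaussian CDF at the observed $\eta^\top y$ yields an exactly uniform pivot under the null, hence exact selection-adjusted p-values and, by inversion, the "selection intervals" mentioned above; the spacing test can be viewed as this pivot applied at the knots of the LAR path.)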


