Colloquium Speaker: Percy Liang, Thurs Oct 1, 4:30pm
Colloquium Speaker
Percy Liang, Stanford University
Thursday, October 1st, 4:30pm
Computer Science 105

Learning Hidden Computational Processes

We are interested in prediction problems in which evaluating the learned function requires multiple intermediate steps of computation. One motivation is building a system that can answer complex questions: here the function would need to map "How many countries have held the Summer Olympics more than once?" to "3" by applying a sequence of aggregation and filtering operations on a database. In this talk, we examine two key machine learning problems that arise in this setting. First, how do we model the computational process? We argue that the classic paradigm of decoupling modeling from inference is inadequate, and we propose techniques that directly model the inference procedure. Second, learning is very difficult: in our example, the supervision "3" constrains the hidden computational process in a very indirect way. We propose methods that relax the output supervision in a parameterized way and learn both the relaxation and the model parameters jointly, subject to an explicit computational constraint. Finally, we show some empirical progress on a challenging new question answering task.

Percy Liang is an Assistant Professor of Computer Science at Stanford University (B.S. from MIT, 2004; Ph.D. from UC Berkeley, 2011). His research interests include (i) modeling natural language semantics, (ii) developing machine learning methods that infer rich latent structures from limited supervision, and (iii) studying the tradeoff between statistical and computational efficiency. He is a 2015 Sloan Research Fellow, a 2014 Microsoft Research Faculty Fellow, and a 2010 Siebel Scholar, and he won the best student paper award at the International Conference on Machine Learning in 2008.
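The talk is about learning such computations from weak supervision, but as a concrete illustration of the kind of computation the abstract refers to, here is a minimal sketch (purely illustrative, not the speaker's system) that maps the example question to its answer by aggregating and filtering over a small toy table of Summer Olympics host countries:

```python
# Purely illustrative sketch, not Percy Liang's system: answer
# "How many countries have held the Summer Olympics more than once?"
# by aggregating and filtering over a small toy table of host countries.
from collections import Counter

# Toy snapshot of a database table: (year, host country) for a few Summer Olympics.
summer_olympics_hosts = [
    ("1896", "Greece"), ("1956", "Australia"), ("1984", "United States"),
    ("1996", "United States"), ("2000", "Australia"), ("2004", "Greece"),
    ("2008", "China"), ("2012", "United Kingdom"),
]

# Aggregation: count how many Games each country has hosted.
games_per_country = Counter(country for _, country in summer_olympics_hosts)

# Filtering: keep only the countries that hosted more than once.
repeat_hosts = [country for country, n in games_per_country.items() if n > 1]

# On this toy table, Greece, Australia, and the United States qualify.
print(len(repeat_hosts))  # -> 3
```

The hard part, which the talk addresses, is learning to produce this kind of hidden computation when the only supervision available is the final answer "3".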
Correction: this talk will be at 12:30pm.
Nicole Wagenblast
Computer Science Department
Princeton University
35 Olden Street
Princeton, NJ 08540
609-258-4624