[Ml-stat-talks] Fwd: [talks] Colloquium Speaker: Manfred K. Warmuth, Today- 12:30pm

Barbara Engelhardt bee at princeton.edu
Mon May 22 09:01:55 EDT 2017


Talk of interest today.

---------- Forwarded message ----------

Colloquium Speaker
Manfred K Warmuth, University of California, Santa Cruz
Monday, May 22, 12:30pm
Computer Science 105

The blessing and the curse of the multiplicative updates - discusses
connections between evolution and the multiplicative updates of online
learning


Multiplicative updates multiply the parameters by nonnegative factors.
These updates are motivated by a Maximum Entropy Principle, and they are
prevalent in evolutionary processes, where the parameters are, for example,
concentrations of species and the factors are survival rates. The simplest
such update is Bayes' rule, and we give an in vitro selection algorithm for
RNA strands that implements this rule in the test tube, where each RNA
strand represents a different model. In one liter of the RNA soup there
are approximately 10^15 different strands, and therefore this is a rather
high-dimensional implementation of Bayes' rule.
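
For concreteness, here is a minimal sketch (our illustration, not code from
the talk) of Bayes' rule viewed as a multiplicative update: each model's
weight is multiplied by the likelihood it assigns to the observation, and
the weights are then renormalized.

import numpy as np

# Illustrative sketch (assumptions, not the speaker's code): Bayes' rule as
# a multiplicative update. Each "model" is a coin-bias hypothesis; each
# observation multiplies every weight by that model's likelihood, then
# renormalizes.

models = np.linspace(0.01, 0.99, 99)               # hypothesized P(heads)
weights = np.full(models.shape, 1.0 / models.size)  # uniform prior

def bayes_update(weights, models, outcome):
    """Multiply each weight by the likelihood of outcome (1=heads, 0=tails)."""
    likelihood = models if outcome == 1 else 1.0 - models
    weights = weights * likelihood   # multiplicative step
    return weights / weights.sum()   # renormalize: this is the posterior

for outcome in [1, 1, 0, 1, 1]:      # a short stream of coin flips
    weights = bayes_update(weights, models, outcome)

print("posterior mode:", models[np.argmax(weights)])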

We investigate multiplicative updates for the purpose of learning online
while processing a stream of examples. The ``blessing'' of these updates is
that they learn very fast in the short term because the good parameters
grow exponentially. However, their ``curse'' is that they learn too fast
and wipe out parameters too quickly. This can have a negative effect in the
long term. We describe a number of methods developed in the realm of online
learning that ameliorate the curse of the multiplicative updates. These
methods make the algorithm robust against data that changes over time and
prevent the currently good parameters from taking over. We also discuss how
the curse is circumvented by nature. Surprisingly, some of nature's methods
parallel the ones developed in machine learning, but nature also has some
additional tricks.
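
One representative method from this literature is the fixed-share update of
Herbster and Warmuth, which mixes a small amount of uniform mass back into
the weights each round so that no parameter is wiped out irrecoverably. A
hedged Python sketch (the losses and constants below are illustrative
assumptions, not taken from the abstract):

import numpy as np

def hedge_step(weights, losses, eta=0.5):
    """Multiplicative (Hedge) update: shrink weights exponentially in loss."""
    weights = weights * np.exp(-eta * losses)
    return weights / weights.sum()

def fixed_share(weights, alpha=0.05):
    """Mix a little uniform mass back in so no expert's weight collapses
    to zero; this keeps the update robust when the best expert changes."""
    n = weights.size
    return (1 - alpha) * weights + alpha / n

n_experts = 4
weights = np.full(n_experts, 1.0 / n_experts)
rng = np.random.default_rng(0)
for t in range(100):
    losses = rng.random(n_experts)   # stand-in for the experts' real losses
    weights = fixed_share(hedge_step(weights, losses))
print(weights)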

This will be a high-level talk.
No background in online learning is required.
We will give a number of open problems and discuss how these updates are
applied to training feedforward neural nets.
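
As one hedged illustration of a multiplicative update in gradient-based
training, here is the exponentiated gradient (EG) update of Kivinen and
Warmuth on a toy objective; how the talk itself connects such updates to
neural-net training is not spelled out in the abstract.

import numpy as np

def eg_step(w, grad, eta=0.1):
    """EG update: multiply each weight by exp(-eta * gradient), renormalize."""
    w = w * np.exp(-eta * grad)
    return w / w.sum()

# Toy use: minimize ||w - target||^2 over the probability simplex.
target = np.array([0.7, 0.2, 0.1])
w = np.full(3, 1.0 / 3.0)
for _ in range(200):
    grad = 2 * (w - target)          # gradient of the squared distance
    w = eg_step(w, grad)
print(np.round(w, 3))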

_______________________________________________
talks mailing list
talks at lists.cs.princeton.edu
To edit subscription settings or remove yourself, use this link:
https://lists.cs.princeton.edu/mailman/listinfo/talks