[Ml-stat-talks] Fwd: Wills seminar Friday March 17, 12:30

Barbara Engelhardt bee at princeton.edu
Thu Mar 16 10:46:11 EDT 2017

Talk of interest.

Time: Friday March 17, 12:30
Place: Sherrerd Hall 101
Addressing Computational and Statistical Gaps with Deep Neural Networks

   - Joan Bruna

   - New York University

*ABSTRACT:* Many modern statistical questions are plagued with asymptotic
regimes that separate our current theoretical understanding from what is
possible given finite computational and sample resources. Two important
examples of such gaps appear in sparse inference and high-dimensional
nonconvex optimisation. In the former, proximal splitting algorithms
efficiently solve the l1-relaxed sparse coding problem, but their
performance is typically evaluated in terms of asymptotic convergence
rates. In the latter, a major challenge is to explain the excellent
empirical performance of stochastic gradient descent when training large
neural networks.
In this talk we will illustrate how deep architectures can be used to
attack such gaps. We will first see how a neural network sparse coding
model (LISTA, Gregor & LeCun’10) can be analyzed in terms of a particular
matrix factorization of the dictionary, which leverages diagonalisation
with invariance of the l1 ball, revealing a phase transition that is
consistent with numerical experiments. We will then discuss the loss
surface of half-rectified neural networks. Despite defining a nonconvex
objective, we will show that by increasing the size of the model, one can
again trade off complexity with computation, in the sense that the landscape
becomes asymptotically free of poor local minima.
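(For context: the proximal splitting approach the abstract refers to is, in its simplest form, ISTA, the iterative soft-thresholding algorithm that LISTA unrolls into a neural network. A minimal numpy sketch, where the dictionary D, signal y, and penalty lam are all illustrative placeholders:)

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (elementwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(D, y, lam, n_iter=200):
    """ISTA for the l1-relaxed sparse coding problem:
    min_x 0.5 * ||D x - y||^2 + lam * ||x||_1
    """
    # Step size 1/L, with L the Lipschitz constant of the smooth gradient.
    L = np.linalg.norm(D, 2) ** 2
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)            # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

LISTA (Gregor & LeCun '10) takes this same iteration, truncates it to a few steps, and learns the matrices and thresholds from data, which is what makes the matrix-factorization analysis of the dictionary mentioned in the abstract relevant.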

More information about the Ml-stat-talks mailing list