Theory Seminar

Representation power of neural networks

Matus Telgarsky, Postdoc, University of Michigan

This talk will survey a number of classical results regarding the representation power of neural networks, and will also present a new result separating shallow and deep networks: namely, there exist classification problems where a shallow network needs exponentially many more nodes to match the performance of a deep network.
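
The announcement does not spell out the construction, but one standard way to see this kind of shallow-versus-deep gap is the iterated tent map: composing a small tent-shaped ReLU gadget with itself k times produces a sawtooth with roughly 2^k oscillations using only O(k) nodes, while a one-hidden-layer ReLU network with m nodes is piecewise linear with at most m+1 pieces and so needs exponentially many nodes to track it. The Python sketch below is illustrative only; the function names (tent, deep_sawtooth) and the crossing-counting heuristic are mine, not taken from the talk or the paper.

    # Illustrative sketch (not from the announcement): a depth-k composition of
    # a 2-ReLU tent map oscillates ~2^(k-1) times on [0, 1], using only 2*k
    # ReLU units in total.
    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def tent(x):
        # Tent map on [0, 1] built from two ReLU units:
        # equals 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

    def deep_sawtooth(x, depth):
        # Composing the tent map `depth` times: one extra "layer" per step.
        for _ in range(depth):
            x = tent(x)
        return x

    if __name__ == "__main__":
        xs = np.linspace(0.0, 1.0, 10001)
        for depth in (1, 2, 4, 8):
            ys = deep_sawtooth(xs, depth)
            # Count how often the curve crosses the level 1/2 as a proxy
            # for its number of oscillations (grows like 2^depth).
            above = (ys > 0.5).astype(int)
            crossings = int(np.sum(np.diff(above) != 0))
            print(f"depth={depth}: ~{crossings} crossings of 1/2")

A flat network would need on the order of one node per linear piece to reproduce this behavior, which is the source of the exponential blow-up.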

All proofs will be elementary and the talk will require no knowledge of machine learning.

The paper is available at http://arxiv.org/abs/1509.08101

Sponsored by CSE