Talks
Fall 2013
Provably Efficient Algorithms for Training Deep Networks
Wednesday, October 23rd, 2013, 2:15 pm–2:45 pm
We present a provably efficient, layer-by-layer algorithm for training deep sum-product neural networks. The algorithm comes with formal polynomial-time convergence guarantees. Like other deep architectures, the resulting network can compactly represent any function on a finite training set. We also discuss provably efficient algorithms for training other types of neural networks.
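The abstract does not spell out the architecture, but a sum-product network is one whose layers alternate weighted sums and products of their inputs. As a rough illustration only (the function names and the pairwise-product layout are assumptions, not the authors' construction), here is a minimal sketch of a forward pass through such a network:

```python
import numpy as np

def sum_layer(x, W):
    """Weighted-sum layer: each unit is a linear combination of the inputs."""
    return W @ x

def product_layer(x):
    """Product layer: each unit multiplies a pair of adjacent inputs.

    Assumes the input has even length; the width halves at every product layer.
    """
    return x[0::2] * x[1::2]

def forward(x, weights):
    """Evaluate an alternating sum/product network on input vector x.

    `weights` is a list of matrices, one per sum layer; each sum layer
    is immediately followed by a pairwise product layer.
    """
    for W in weights:
        x = product_layer(sum_layer(x, W))
    return x

# Two inputs, one sum layer (identity weights), one product layer:
x = np.array([1.0, 2.0])
print(forward(x, [np.eye(2)]))  # → [2.]  (1.0 * 2.0)
```

A layer-by-layer training scheme of the kind the talk describes would fit such weight matrices greedily, one layer at a time, rather than by end-to-end gradient descent.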
| Attachment | Size |
|---|---|
| Provably Efficient Algorithms for Training Deep Networks (slides) | 396.66 KB |