Talks
Fall 2020

Learning and Testing for Gradient Descent
Tuesday, December 15th, 2020, 9:00 am–9:30 am
Speaker:
Emmanuel Abbe (EPFL)
Location:
Zoom
We present lower bounds on the generalization error of gradient descent under free initializations, reducing the problem to that of testing the algorithm's output under different data models. We then discuss lower bounds under random initialization and present the problem of learning communities in the pruned-block-model, where gradient descent is conjectured to fail.
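As background for the distinction the abstract draws between free and random initializations, here is a minimal sketch of plain gradient descent on a least-squares objective with the initialization passed in explicitly. This is an illustration only, not the speaker's construction; the objective, step size, and step count are assumptions for the example.

```python
import numpy as np

def gradient_descent(X, y, w0, lr=0.1, steps=500):
    """Full-batch GD on the squared loss; w0 is the initialization."""
    w = w0.astype(float).copy()
    n = len(y)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n  # gradient of (1/2n)||Xw - y||^2
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_star = rng.standard_normal(5)
y = X @ w_star  # noiseless linear data for the sketch

# Random initialization vs. a freely chosen one (here, zeros);
# lower bounds of the kind in the talk are stated over such choices.
w_rand = gradient_descent(X, y, rng.standard_normal(5))
w_zero = gradient_descent(X, y, np.zeros(5))
```

On this well-conditioned convex problem both initializations recover `w_star`; the talk's lower bounds concern settings where no choice of initialization (or only random ones) lets GD succeed.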