Events Fall 2020

Richard M. Karp Distinguished Lecture — Insights on Gradient-Based Algorithms in High-Dimensional Learning

Monday, September 14th, 2020, 11:00 am – 12:00 pm


Speaker: 
Location: 

This talk will be held virtually and will be live streamed on our website. Full participation (including the capacity to ask questions) will be available via Zoom webinar. Zoom link: https://berkeley.zoom.us/j/97166437509.

Gradient descent algorithms and their noisy variants, such as Langevin dynamics or multipass stochastic gradient descent, are at the center of attention in machine learning. Yet their behavior remains perplexing, in particular in the high-dimensional nonconvex setting. In this talk, I will present several high-dimensional and (mostly) nonconvex statistical learning problems in which the performance of gradient-based algorithms can be analyzed down to a constant. The common point of these settings is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of the gradient-based algorithms. The covered settings include the spiked mixed matrix-tensor model, the perceptron, and phase retrieval.
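To fix ideas on the two dynamics mentioned in the abstract, here is a minimal sketch (not from the talk) of plain gradient descent versus its Langevin variant, which adds Gaussian noise scaled by the step size to each update. The toy double-well objective, function names, and all parameter values are illustrative assumptions.

```python
import numpy as np

def gradient_flow(grad, x0, lr=0.05, steps=500, noise=0.0, seed=0):
    """Iterate x <- x - lr * grad(x) + sqrt(2 * lr * noise) * xi.

    With noise == 0 this is plain gradient descent; with noise > 0
    it is a discretized Langevin dynamics at temperature `noise`.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x) + np.sqrt(2.0 * lr * noise) * rng.standard_normal(x.shape)
    return x

# Toy nonconvex objective: f(x) = (x^2 - 1)^2, with minima at x = +/-1.
f = lambda x: (x**2 - 1.0) ** 2
grad_f = lambda x: 4.0 * x * (x**2 - 1.0)

x_gd = gradient_flow(grad_f, x0=[0.3])               # plain gradient descent
x_ld = gradient_flow(grad_f, x0=[0.3], noise=0.01)   # noisy (Langevin) variant
```

Plain gradient descent settles into the nearest minimum, while the Langevin iterates fluctuate around it; the talk concerns regimes where such trajectories can be characterized exactly in high dimension, not this one-dimensional cartoon.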

If you require accommodation for communication, please contact our Access Coordinator at simonsevents [at] berkeley.edu with as much advance notice as possible.