Talks

Universality of Neural Networks

Tuesday, December 7th, 2021, 11:35 am – 11:50 am

Speaker: 
Location: Calvin Lab Auditorium

It is well known that, at random initialization, neural networks can be well approximated by Gaussian processes as their width approaches infinity. We quantify this phenomenon by providing non-asymptotic convergence rates in the space of continuous functions. In the process, we study the Central Limit Theorem in high and infinite dimensions, as well as anti-concentration properties of polynomials in random variables.
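The qualitative phenomenon behind the talk is easy to observe numerically. Below is a minimal sketch (not from the talk or the accompanying paper; the architecture, widths, and sample counts are illustrative assumptions) that samples a one-hidden-layer ReLU network at random initialization and checks that, as the width grows, the skewness and excess kurtosis of its output at a fixed input approach the Gaussian values of zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def network_output(x, width):
    # One-hidden-layer ReLU network with 1/sqrt(width) output scaling,
    # all weights drawn i.i.d. standard Gaussian at initialization.
    W = rng.normal(size=(width, x.shape[0]))
    v = rng.normal(size=width)
    return v @ np.maximum(W @ x, 0.0) / np.sqrt(width)

x = np.array([1.0, -0.5, 2.0])  # an arbitrary fixed input

for width in (10, 100, 1000):
    samples = np.array([network_output(x, width) for _ in range(5000)])
    z = (samples - samples.mean()) / samples.std()
    # For a Gaussian, skewness and excess kurtosis are both 0;
    # both should shrink toward 0 as the width increases.
    print(f"width={width:5d}  skew={np.mean(z**3):+.3f}  "
          f"excess kurtosis={np.mean(z**4) - 3:+.3f}")
```

The same experiment extends to the functional setting the abstract quantifies: evaluating the network jointly at several inputs and comparing the empirical covariance of the outputs to the corresponding Gaussian-process kernel.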

Attachment: neuraluniversality.pdf (226.82 KB)