Talks
Fall 2018

Fast Algorithms for Multivariate ODEs arising in Sampling and Learning

Thursday, September 27th, 2018, 2:00 pm – 2:30 pm


Speaker: 

Santosh Vempala (Georgia Institute of Technology)

Sampling logconcave functions arising in statistics and machine learning has been a subject of intensive study. Recent developments include analyses of Langevin dynamics and Hamiltonian Monte Carlo (HMC). While both approaches have dimension-independent bounds for the underlying continuous processes under sufficiently strong smoothness conditions, the resulting discrete algorithms have complexity and number of function evaluations growing with the dimension. Here we give a nearly linear-time implementation of HMC for sampling from a broad class of densities, with the number of iterations and function evaluations being polylogarithmic in the dimension (rather than polynomial, as in previous work). This class includes the widely used loss function for logistic regression with incoherent weight matrices. Our main contribution is a fast solver for multivariate ordinary differential equations (ODEs) whose solution is close to a low-degree polynomial. We establish logarithmic bounds on the degree for the differential equations arising in HMC. Joint work with Yin Tat Lee and Zhao Song.
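
One standard way to exploit the structure the abstract describes, an ODE whose solution is close to a low-degree polynomial, is a collocation-style scheme: represent the trajectory by its values at a few nodes and refine a polynomial approximation by Picard iteration. The sketch below is a rough illustration under that assumption, not the authors' implementation; the function name and parameters are hypothetical.

# Minimal sketch (assumed collocation/Picard scheme, not the speakers' code):
# approximate the solution of x'(t) = F(t, x(t)) on [0, T] by a low-degree
# polynomial, refining its values at Chebyshev nodes by Picard iteration.
import numpy as np

def collocation_solve(F, x0, T, degree=8, iters=50):
    """Approximate x(t) on [0, T] with x(0) = x0, where F(t, x) returns dx/dt.
    Handles multivariate x coordinate-wise; returns the nodes and the values there."""
    # Chebyshev-Lobatto nodes mapped to [0, T]
    k = np.arange(degree + 1)
    nodes = T * 0.5 * (1 - np.cos(np.pi * k / degree))
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    # Initial guess: the constant trajectory x(t) = x0 at every node
    X = np.tile(x0, (len(nodes), 1))              # shape: (#nodes, dim)
    for _ in range(iters):
        # Evaluate the vector field along the current approximate trajectory
        derivs = np.array([F(t, x) for t, x in zip(nodes, X)])
        X_new = np.empty_like(X)
        for j in range(X.shape[1]):               # each coordinate separately
            # Fit a polynomial to x_j'(t) at the nodes, then integrate it
            p = np.polynomial.Polynomial.fit(nodes, derivs[:, j], degree - 1)
            P = p.integ()                         # antiderivative of the fit
            X_new[:, j] = x0[j] + P(nodes) - P(0.0)
        if np.max(np.abs(X_new - X)) < 1e-10:     # fixed point reached
            X = X_new
            break
        X = X_new
    return nodes, X

# Example: x'(t) = -x, x(0) = 1, whose exact solution is exp(-t)
nodes, X = collocation_solve(lambda t, x: -x, [1.0], T=1.0)
print(np.max(np.abs(X[:, 0] - np.exp(-nodes))))   # small approximation error

In HMC, F would encode Hamilton's equations for the position and momentum, and the degree bound controls how few node evaluations, and hence gradient computations, each step needs.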