Random Projections for Probabilistic Inference
Calvin Lab Auditorium
Probabilistic inference in high-dimensional probabilistic models (i.e., models with many variables) is one of the central problems of statistical machine learning and stochastic decision making. To date, only a handful of distinct methods have been developed, most notably Markov chain Monte Carlo (MCMC) sampling, decomposition, and variational methods. In this talk, I will introduce a new approach in which random projections are used to simplify a high-dimensional model while preserving some of its key properties. These random projections can be combined with traditional variational inference methods (information projections) and combinatorial optimization tools. The resulting randomized approaches provide provable accuracy guarantees and outperform traditional methods in a range of domains.
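As a loose illustration of the general idea only (not the construction used in the talk), the sketch below applies a Gaussian random projection in the Johnson-Lindenstrauss style: high-dimensional points are mapped to a much lower dimension by a random matrix, and pairwise distances are approximately preserved. All dimensions and variable names here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch only: project n-dimensional points down to k
# dimensions with a Gaussian random matrix, checking that pairwise
# distances are roughly preserved (Johnson-Lindenstrauss style).
n, k, num_points = 10_000, 200, 50            # hypothetical sizes
X = rng.normal(size=(num_points, n))          # high-dimensional points
R = rng.normal(size=(n, k)) / np.sqrt(k)      # random projection matrix
Y = X @ R                                     # low-dimensional projections

d_hi = np.linalg.norm(X[0] - X[1])            # distance before projection
d_lo = np.linalg.norm(Y[0] - Y[1])            # distance after projection
print(f"original distance {d_hi:.1f}, projected distance {d_lo:.1f}")
```

The point of the sketch is only that a cheap random map can shrink a model's dimensionality while retaining enough geometric structure for downstream inference; the talk's actual projections and guarantees are developed in the slides below.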
| Attachment | Size |
|---|---|
| Random Projections for Probabilistic Inference | 2.2 MB |