Spring 2017
ML Thursday Seminar
Thursday, April 13th, 2017, 10:00 am–12:00 pm
Location: Calvin Lab Room 116
Deep Submodular Functions
We start by covering how submodularity is useful in machine learning and data science (e.g., summarization, diversity modeling, tree-width-unrestricted probabilistic models). We then show that while certain submodular functions (e.g., sums of concave functions composed with modular functions (SCMMs)) are useful due to their practicality, scalability, and versatility, they are limited in their ability to model higher-level interactions. We thus define a new class of submodular functions, deep submodular functions (DSFs), and show that DSFs constitute a strictly larger class of submodular functions than SCMMs while sharing all of the SCMM advantages. DSFs are a parametric family of submodular functions that share many of the properties of deep neural networks (DNNs), including many-layered hierarchical topologies, distributed representations, opportunities and strategies for training, and suitability to GPU-based matrix/vector computing. We show that for any integer $k>0$, there are $k$-layer DSFs that cannot be represented by a $k'$-layer DSF for any $k'<k$. This implies that, as with DNNs, there is utility to depth; but unlike DNNs (which can be universally approximated by shallow networks), the family of DSFs strictly increases with depth. We also show, however, that DSFs, even with arbitrarily large $k$, do not comprise all submodular functions. The talk assumes no prior background in submodularity and will be self-contained. This is joint work with Wenruo Bai.
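For reference (this gloss is not part of the abstract): an SCMM is standardly written as
$$f(A) \;=\; \sum_{i=1}^{n} w_i\,\phi_i\big(m_i(A)\big), \qquad m_i(A) = \sum_{a\in A} m_i(a),$$
where each $\phi_i$ is a non-decreasing concave function, each $m_i$ is a non-negative modular function, and $w_i \ge 0$. A two-layer DSF nests this construction once more,
$$f(A) \;=\; \phi\Big(\sum_j w_j\,\phi_j\big(m_j(A)\big)\Big),$$
and deeper DSFs iterate the layering. Submodularity is preserved layer by layer: a non-decreasing concave function of a monotone submodular function is again monotone submodular, and non-negative weighted sums of monotone submodular functions remain monotone submodular.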
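As a concrete illustration, here is a minimal sketch in Python (the weights and the helper names `dsf` and `is_submodular` are hypothetical, not from the talk) that evaluates a toy two-layer DSF built from square roots of non-negative modular functions and brute-force checks the diminishing-returns property on a four-element ground set:

```python
import itertools
import math

# Toy two-layer DSF on a 4-element ground set (all numbers made up).
# Layer 1: two units, each a concave function (sqrt) of a non-negative
# modular function.  Layer 2: one unit, a concave function (sqrt) of a
# non-negative weighted sum of the layer-1 outputs.
V = set(range(4))
M = [
    [1.0, 2.0, 0.5, 1.5],  # m_1(a) for each ground element a
    [0.8, 0.3, 2.2, 1.1],  # m_2(a)
]
W = [1.0, 0.7]             # non-negative mixing weights into the top unit

def dsf(A):
    """f(A) = sqrt( sum_j W[j] * sqrt( m_j(A) ) )."""
    inner = [math.sqrt(sum(m[a] for a in A)) for m in M]    # concave o modular
    return math.sqrt(sum(w * x for w, x in zip(W, inner)))  # concave o nonneg sum

def is_submodular(f, ground, tol=1e-12):
    """Brute-force diminishing returns: for every A <= B and v not in B,
    require f(A + v) - f(A) >= f(B + v) - f(B)."""
    subsets = [frozenset(s)
               for r in range(len(ground) + 1)
               for s in itertools.combinations(ground, r)]
    for A in subsets:
        for B in subsets:
            if not A <= B:
                continue
            for v in ground - B:
                if f(A | {v}) - f(A) < f(B | {v}) - f(B) - tol:
                    return False
    return True

print(is_submodular(dsf, V))  # expect: True
```

The check passes on this example, consistent with the layer-by-layer preservation argument sketched above; exhaustive checking is of course only feasible on tiny ground sets.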