
Set-Theoretic Independence for Statistical Learning

Wednesday, November 28th, 2018, 10:30 am–12:00 pm

Speaker: Shai Ben-David

Location: Room 116

We present the first set-theoretic independence result for a question in machine learning. We prove that the learnability of a certain class in Vapnik’s general model of statistical learning is independent of the axioms of set theory. We deduce that no combinatorial dimension can characterize learnability in that model the way the VC dimension characterizes binary classification learning.
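For context (standard background, not part of the new result announced here): by the fundamental theorem of statistical learning, a class of binary classifiers is PAC learnable exactly when its VC dimension is finite, and that dimension also determines the sample complexity up to constants. In the agnostic setting,

\[
\mathcal{H} \subseteq \{0,1\}^{\mathcal{X}} \ \text{is PAC learnable} \iff \mathrm{VCdim}(\mathcal{H}) < \infty,
\qquad
m_{\mathcal{H}}(\epsilon, \delta) = \Theta\!\left( \frac{\mathrm{VCdim}(\mathcal{H}) + \log(1/\delta)}{\epsilon^{2}} \right).
\]

It is a characterization of this kind that the independence result rules out for Vapnik’s general model.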

Vapnik’s general model captures many well-studied learning problems, such as center-based clustering of distributions, binary and multi-class classification, linear regression, and more. The fundamental problem of finding a dimension that captures learnability in this model was raised, naturally, by Vapnik in his 1998 book, and again by Shalev-Shwartz, Shamir, Srebro, and Sridharan in their seminal 2010 paper on learning beyond uniform convergence.
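As a sketch of the setting (this is the standard formulation of Vapnik’s general setting of learning, paraphrased as background rather than taken from the talk): one fixes a domain \(\mathcal{Z}\), a hypothesis set \(\mathcal{H}\), and a loss function \(f : \mathcal{H} \times \mathcal{Z} \to \mathbb{R}\), and learnability asks for a rule \(A\) whose expected loss approaches the best achievable in \(\mathcal{H}\) from i.i.d. samples, uniformly over distributions:

\[
F_{P}(h) = \mathbb{E}_{z \sim P}\big[ f(h, z) \big],
\qquad
\Pr_{S \sim P^{\,m(\epsilon,\delta)}}\Big[ F_{P}\big(A(S)\big) \le \inf_{h \in \mathcal{H}} F_{P}(h) + \epsilon \Big] \ge 1 - \delta
\quad \text{for every } P \text{ over } \mathcal{Z}.
\]

Each of the problems listed above is recovered by an appropriate choice of \(\mathcal{Z}\), \(\mathcal{H}\), and \(f\).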

Along the way to establishing the independence theorem, we derive several new results for that model, including a connection between general learnability and a novel sample compression scheme, and a boosting theorem that may be of independent interest.
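For readers who have not met the term: a sample compression scheme, in the classical sense of Littlestone and Warmuth, pairs a compression map that retains a bounded-size subsequence of the training sample (possibly with side information) with a reconstruction map that recovers a hypothesis from that subsequence. The scheme in the talk is a novel variant adapted to Vapnik’s model, so the following is only the classical template:

\[
\kappa : \bigcup_{m \ge 1} \mathcal{Z}^{m} \to \bigcup_{j \le k} \mathcal{Z}^{j},
\qquad
\rho : \bigcup_{j \le k} \mathcal{Z}^{j} \to \mathcal{H},
\qquad
\kappa(S) \ \text{a subsequence of } S, \quad \rho\big(\kappa(S)\big) \ \text{performs well on all of } S.
\]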

This is joint work with Shay Moran, Pavel Hrubeš, Amir Shpilka, and Amir Yehudayoff.