Lecture 1: Information Theory in Combinatorics I
Lecture 2: Information Theory in Combinatorics II
This series of talks is part of the Information Theory Boot Camp. Videos for each talk will be available through the links above.
Speaker: Shachar Lovett, UC San Diego
We will begin with a quick review of information-theoretic notions such as entropy, conditional entropy, mutual information, and information divergence (Kullback-Leibler divergence). Building on these, we will give information-theoretic proofs of several classical results: Bregman's theorem, bounds for counting antichains, the Fredman-Komlós bound, and Shearer's lemma together with its applications. Then we will see how information-theoretic ideas can be used to establish various inequalities.
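For reference, the notions reviewed at the start of the talks have the following standard definitions, for discrete random variables $X, Y$ with joint distribution $p$ and a second distribution $q$ on the range of $X$:
\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\log p(x \mid y),
\]
\[
I(X;Y) = H(X) - H(X \mid Y), \qquad
D(p \,\|\, q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)}.
\]
Shearer's lemma, one of the results mentioned, states that if $\mathcal{F}$ is a family of subsets of $\{1,\dots,n\}$ in which every index appears in at least $k$ sets, then for any random variables $X_1,\dots,X_n$,
\[
H(X_1,\dots,X_n) \;\le\; \frac{1}{k}\sum_{S\in\mathcal{F}} H(X_S),
\]
where $X_S = (X_i)_{i\in S}$.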