An Information Theoretic View of Hypercontractivity
Calvin Lab Auditorium
The Beckner-Bonami inequality was first imported to the scene of theoretical computer science and analytic combinatorics in the famous KKL (Kahn-Kalai-Linial) paper. Since then, it has become a cornerstone of discrete Fourier analysis and has had great influence (no pun intended) and numerous and diverse applications. Roughly, the inequality compares one norm of a real-valued function f on {0,1}^n with a different norm of Tf, where T is an averaging operator. Not surprisingly, the averaging produces a "smoother" function, and this is captured by the comparison.
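For concreteness, here is the standard single-function form of the inequality on the discrete cube, offered as a sketch of the usual formulation (the exact parameters and variant used in the talk may differ). Writing $T_\rho$ for the noise operator, $(T_\rho f)(x) = \mathbb{E}[f(y)]$ where $y$ is obtained from $x$ by independently keeping each coordinate with probability $\rho$ and rerandomizing it otherwise:

\[
  \|T_\rho f\|_q \;\le\; \|f\|_p
  \qquad \text{for all } f:\{0,1\}^n \to \mathbb{R},
  \quad 1 \le p \le q, \quad 0 \le \rho \le \sqrt{\frac{p-1}{q-1}},
\]

where the norms are taken with respect to the uniform measure on $\{0,1\}^n$. In words: after averaging, a larger norm of $T_\rho f$ is controlled by a smaller norm of $f$.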
A slightly different interpretation is that the inequality bounds the correlation between a vector chosen from a given distribution and a noisy version of it. In this talk we will prove such a version of the inequality using an information theoretic approach.
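One common way to make this "noisy correlation" reading precise, again only as a sketch of a standard two-function formulation rather than the exact statement proved in the talk, is the following: let $x$ be uniform on $\{0,1\}^n$ and let $y$ be a $\rho$-correlated copy of $x$ (each coordinate of $y$ independently agrees with the corresponding coordinate of $x$ with probability $\tfrac{1+\rho}{2}$). Then for all $f, g : \{0,1\}^n \to \mathbb{R}$,

\[
  \mathbb{E}\bigl[f(x)\,g(y)\bigr] \;\le\; \|f\|_p \, \|g\|_q
  \qquad \text{whenever } p, q \ge 1 \text{ and } 0 \le \rho \le \sqrt{(p-1)(q-1)} .
\]

This form is equivalent to the single-function inequality above via H\"older's inequality, since $\mathbb{E}[f(x)\,g(y)] = \mathbb{E}[f \cdot T_\rho g]$.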
Previous information theoretic proofs of the dual version of the inequality were given by Friedgut-Rodl and Blais-Tan, but the current proof appears to be unrelated to them.
Attachment: entropybeckner.pdf (2.32 MB)