[talks] O Weinstein general exam

Melissa M. Lawson mml at CS.Princeton.EDU
Fri May 18 09:25:34 EDT 2012


Omri Weinstein will present his research seminar/general exam on 
Wednesday May 23 at 2PM in Room 402.  The members of his committee 
are:  Mark Braverman (advisor), Zeev Dvir, and Sanjeev Arora.  Everyone 
is invited to attend his talk, and those faculty wishing to remain for 
the oral exam that follows are welcome to do so.  His abstract and 
reading list follow below.
--------------------------------
Abstract:

In the late 1940s, Shannon developed his information theory in order to understand the one-way data transmission problem.
His work revealed the tight connection between communication and information, namely, that the description length of a 
one-way message is essentially equivalent to the amount of information it contains. 
The goal of our ongoing research is to extend this theory, develop the right tools, and 
understand how information behaves in *interactive* settings, where two (or more) parties must engage in an interactive 
conversation in order to accomplish some desirable task.
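
For context, the quantitative statement behind "essentially equivalent" is Shannon's source coding theorem: for a source $X$, 
the minimum expected length $L^*$ of a prefix-free encoding satisfies $H(X) \le L^* < H(X) + 1$, so the one-way cost of 
transmitting $X$ is its entropy, up to a single bit.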

Motivated by applications in communication complexity, our main quantity of interest is the Information Complexity of a problem, 
which informally measures the average amount of information the players need to reveal to each other about their inputs in order 
to solve the problem.
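
For concreteness, one standard way to formalize this (the notation here is not taken from the papers below): for inputs 
$(X,Y) \sim \mu$ and a protocol $\pi$ with transcript $\Pi$, the information cost is 
$IC_\mu(\pi) = I(\Pi;X \mid Y) + I(\Pi;Y \mid X)$, and the information complexity of a function $f$ is the infimum of 
$IC_\mu(\pi)$ over all protocols $\pi$ that compute $f$.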

We provide the first general technique for proving information lower bounds on two-party communication problems,
by showing that the discrepancy lower bound (which applies to randomized communication complexity) also applies to information 
complexity. More precisely, if the discrepancy of a two-party function $f$ with respect to a distribution $\mu$ is 
$Disc_\mu f$, then any two-party randomized protocol computing $f$ must reveal at least 
$\Omega(\log (1/Disc_\mu f))$ bits of information to the participants. As a corollary, we obtain that 
any two-party protocol for computing a random function on $\{0,1\}^n\times\{0,1\}^n$ must reveal 
$\Omega(n)$ bits of information to the participants. The proof of our main result develops a new simulation procedure 
that may be of independent interest. 
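
Here discrepancy is the standard rectangle-based quantity; in the same notation, 
$Disc_\mu f = \max_{R = A \times B} \left| \mu(R \cap f^{-1}(0)) - \mu(R \cap f^{-1}(1)) \right|$, where the maximum is over 
all combinatorial rectangles $R$, so the main result can be summarized as $IC_\mu(f) \ge \Omega(\log(1/Disc_\mu f))$.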

In addition, we prove that the discrepancy of the Greater-Than function is $O(1/\sqrt{n})$, which provides a simplified 
proof of the $\Omega(\log n)$ lower bound 
on the communication complexity of this well-studied function and, combined with our main result, proves the tight 
$\Omega(\log n)$ lower bound on its information complexity.
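
To spell out how the two bounds combine: 
$IC_\mu(GT_n) \ge \Omega(\log(1/Disc_\mu GT_n)) \ge \Omega(\log \sqrt{n}) = \Omega(\log n)$, and since the communication 
cost of a protocol is never smaller than its information cost, the classical communication lower bound is recovered as well.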

I will also try to cover some of our recent results: an optimal zero-error information-cost protocol for the notable AND function,
which in turn enables us to compute the *exact* communication complexity ($\approx 0.48n$) of the well-studied Disjointness function. 
Our results contribute to the characterization and understanding of several properties of the information complexity function, 
and in particular lead to the computability of this measure.
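
Schematically, and paraphrasing the result rather than quoting it: the randomized communication complexity of $DISJ_n$ is 
$(C_{DISJ} \pm o(1)) \cdot n$, where the constant $C_{DISJ} \approx 0.4827$ (rounded to $0.48$ above) arises as the maximum, 
over distributions $\mu$ placing no mass on the input pair $(1,1)$, of the zero-error information complexity of AND under $\mu$.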

The first result is joint work with Mark Braverman; the second is joint work with Mark Braverman, Denis Pankratov, and Ankit Garg.

READING LIST:

1) Information = Amortized Communication [BR].

2) How to Compress Interactive Communication [BBCR].

3) Interactive Communication Complexity [Bra].

4) Strong Direct Product [JPY].

5) The communication complexity of Gap Hamming Distance [Sherstov].

6) The communication complexity of Correlation [HJMR].


BOOK: 

Computational Complexity: A Modern Approach [Arora, Barak].

