A Discrepancy Lower Bound for Information Complexity

Algorithmica (2015)

Abstract
This paper provides the first general technique for proving information lower bounds on two-party unbounded-rounds communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function f with respect to a distribution μ is Disc_μ f, then any two-party randomized protocol computing f must reveal at least Ω(log(1/Disc_μ f)) bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on {0,1}^n × {0,1}^n must reveal Ω(n) bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is Ω(1/√n), which provides an alternative proof of the recent result of Viola (SODA 2013, pp 632–651) establishing the Ω(log n) lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight Ω(log n) lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest. In the follow-up breakthrough work of Kerenidis et al. (FOCS 2012, pp 500–509), our simulation procedure served as a building block towards a proof that almost all known lower bound techniques for communication complexity (and not just discrepancy) apply to information complexity as well.
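The abstract's central quantity, the discrepancy Disc_μ f (the largest imbalance between 0s and 1s of f, measured under μ, over any combinatorial rectangle), can be made concrete by brute force on a toy instance. The sketch below is for intuition only and is not the paper's construction: the uniform distribution, the choice n = 2, and the bitmask enumeration of rectangles are all illustrative assumptions. It computes the discrepancy of the Greater-Than function and the implied log(1/Disc) information bound from the main theorem.

```python
import math

# Brute-force discrepancy, feasible only for tiny n (illustrative sketch):
#   Disc_mu(f) = max over combinatorial rectangles A x B of
#       | mu({(x,y) in A x B : f=1}) - mu({(x,y) in A x B : f=0}) |.
# mu is taken uniform and f is Greater-Than on n = 2 bits; both are
# assumptions made for this toy example, not the paper's setting.

n = 2
domain = range(2 ** n)
mu = 1.0 / (2 ** n) ** 2          # uniform weight of each input pair

def gt(x, y):
    """Greater-Than: 1 iff x > y, reading inputs as n-bit integers."""
    return 1 if x > y else 0

best = 0.0
# Enumerate every rectangle A x B, each subset encoded as a bitmask.
for mask_a in range(1 << (2 ** n)):
    A = [x for x in domain if (mask_a >> x) & 1]
    for mask_b in range(1 << (2 ** n)):
        B = [y for y in domain if (mask_b >> y) & 1]
        # Signed measure of the rectangle: +mu per 1-entry, -mu per 0-entry.
        signed = sum(mu * (1 if gt(x, y) else -1) for x in A for y in B)
        best = max(best, abs(signed))

print(f"Disc(GT) under uniform mu, n={n}: {best:.4f}")
print(f"implied information bound ~ log2(1/Disc) = {math.log2(1 / best):.2f} bits")
```

At n = 2 the search covers all 2^4 × 2^4 subset pairs, so it runs instantly; the point is only to see the quantity Ω(log(1/Disc_μ f)) emerge from a concrete matrix, since for the Ω(1/√n) bound on Greater-Than the paper argues analytically rather than by enumeration.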
Keywords
Success Probability, Communication Complexity, Information Cost, Information Complexity, Compression Result