The Lossy Common Information of Correlated Sources

arXiv (Cornell University) (2014)

Cited by 46 | Views 11
Abstract
The two most prevalent notions of common information (CI) are due to Wyner and Gacs-Korner, and both notions can be stated as two different characteristic points in the lossless Gray-Wyner region. Although the information theoretic characterizations of these two CI quantities can be easily evaluated for random variables with infinite entropy (e.g., continuous random variables), their operational significance applies only to the lossless framework. The primary objective of this paper is to generalize these two CI notions to the lossy Gray-Wyner network, thereby extending their theoretical foundation to general sources and distortion measures. We begin by deriving a single-letter characterization for the lossy generalization of Wyner's CI, defined as the minimum rate on the shared branch of the Gray-Wyner network that maintains the minimum sum transmit rate when the two decoders reconstruct the sources subject to individual distortion constraints. To demonstrate its use, we compute the CI of bivariate Gaussian random variables for the entire regime of distortions. We then similarly generalize Gacs and Korner's definition to the lossy framework. The latter half of the paper focuses on the tradeoff between the total transmit rate and receive rate in the Gray-Wyner network. We show that this tradeoff yields a contour of points on the surface of the Gray-Wyner region which passes through both the Wyner and Gacs-Korner operating points, and thereby provides a unified framework for understanding the different notions of CI. We further show that this tradeoff generalizes the two notions of CI to the excess sum transmit rate and receive rate regimes, respectively.
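For the lossless case referenced above, Wyner's common information of a bivariate Gaussian pair with correlation coefficient ρ has the known closed form C_W = ½ ln((1+|ρ|)/(1−|ρ|)) nats. The sketch below evaluates this formula; it illustrates only the lossless benchmark, not the paper's lossy characterization, and the function name is our own.

```python
import math

def wyner_ci_gaussian(rho: float) -> float:
    """Wyner's common information (in nats) of a bivariate Gaussian
    pair with correlation coefficient rho, via the closed form
    C_W = (1/2) * ln((1 + |rho|) / (1 - |rho|))."""
    r = abs(rho)
    if not r < 1.0:
        raise ValueError("correlation must satisfy |rho| < 1")
    return 0.5 * math.log((1.0 + r) / (1.0 - r))

# Independent sources share no common information; C_W grows without
# bound as |rho| approaches 1.
print(wyner_ci_gaussian(0.0))  # → 0.0
print(wyner_ci_gaussian(0.9))
```

Note that C_W is symmetric in the sign of ρ and strictly increasing in |ρ|, consistent with more strongly correlated sources requiring a larger shared rate.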
Keywords
Common information, Gray-Wyner network, multiterminal source coding