Binary Distributed Hypothesis Testing Via Körner-Marton Coding

2016 IEEE Information Theory Workshop (ITW)

Cited by 7
Abstract
We consider the problem of distributed binary hypothesis testing of two sequences that are generated by a doubly binary symmetric source. Each sequence is observed by a different terminal. The two hypotheses correspond to different levels of correlation between the two source components, i.e., to different values of the crossover probability governing the i.i.d. difference between the two sequences. The terminals communicate with a decision function via equal-rate noiseless links. We analyze the tradeoff between the exponential decay of the error probabilities of the hypothesis test and the communication rate. As Körner-Marton coding is known to minimize the rate in the corresponding distributed compression problem of conveying the difference sequence, it constitutes a natural candidate for the present setting. Indeed, using this scheme we derive achievable error exponents. Interestingly, these coincide with part of the optimal tradeoff without communication constraints, even when the rate is below the Körner-Marton rate for one of the hypotheses.
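
The scheme named in the abstract can be made concrete. In Körner-Marton coding, both terminals apply the same linear code and transmit only the syndromes of their observed sequences; since the code is linear, XORing the two syndromes yields the syndrome of the difference sequence Z = X xor Y, which can then be decoded and tested. The Python sketch below is a toy illustration of that mechanism, not the paper's construction: the [7,4] Hamming code, the crossover probabilities p0 and p1, and the decision threshold are all assumptions chosen by hand, whereas the paper's exponents are derived analytically for asymptotically good linear codes.

import numpy as np

rng = np.random.default_rng(0)

# Parity-check matrix of the [7,4] Hamming code (illustrative choice only).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
n = H.shape[1]

# Coset-leader table: every syndrome maps to a minimum-weight error pattern
# (weight 0 or 1 for the Hamming code), used to decode the difference block.
leaders = {(0, 0, 0): np.zeros(n, dtype=int)}
for i in range(n):
    e = np.zeros(n, dtype=int)
    e[i] = 1
    leaders[tuple(int(b) for b in H @ e % 2)] = e

def dsbs_blocks(num_blocks, p):
    """Doubly binary symmetric source, emitted as length-n blocks:
    X uniform, Y = X xor Z with Z i.i.d. Bernoulli(p)."""
    x = rng.integers(0, 2, size=(num_blocks, n))
    z = (rng.random((num_blocks, n)) < p).astype(int)
    return x, x ^ z

def km_decide(x_blocks, y_blocks, threshold):
    """Each terminal sends only H @ (its block) mod 2; the decision function
    XORs the syndromes, decodes the difference, and thresholds its weight."""
    weight = 0
    for x, y in zip(x_blocks, y_blocks):
        s = tuple(int(b) for b in (H @ x + H @ y) % 2)  # syndrome of z = x xor y
        weight += leaders[s].sum()                      # decoded difference weight
    frac = weight / (len(x_blocks) * n)
    return 0 if frac < threshold else 1   # 0 -> declare H0 (low crossover probability)

# Hypotheses (assumed values for this toy): crossover probability p0 vs. p1.
p0, p1, thr = 0.02, 0.2, 0.06
x0, y0 = dsbs_blocks(2000, p0)
x1, y1 = dsbs_blocks(2000, p1)
print("decision when H0 is true:", km_decide(x0, y0, thr))   # expect 0
print("decision when H1 is true:", km_decide(x1, y1, thr))   # expect 1

In this sketch each terminal sends 3 bits of syndrome per 7 source bits, so both links operate at the same rate, mirroring the equal-rate setup of the abstract; in the ideal scheme the required rate for conveying the difference sequence is its entropy, i.e., the binary entropy of the crossover probability.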
Keywords
binary distributed hypothesis testing, Körner-Marton coding, doubly binary symmetric source, error probabilities, distributed compression problem