Inapproximability of the capacity of information stable finite state channels

arXiv: Information Theory (2016)

Abstract
The capacity of a noisy channel is the maximum asymptotic rate at which we can send information over it. For memoryless channels the capacity is given by a simple optimization problem, as proven in Shannon's noisy coding theorem. Furthermore, for these channels, the Blahut-Arimoto algorithm efficiently solves the optimization problem and computes the capacity. What can be said about the general situation of channels with memory? In this work we consider one of the simplest families of such channels: the so-called finite state machine channels (FSMC). We show that there cannot exist any algorithm that approximates the capacity of every FSMC to within any desired precision. More concretely, we construct a subfamily S of information stable FSMCs with 10 elements in the input alphabet and 62 states, such that the capacity of each member is either greater than or equal to 1 or less than or equal to 1/2, and we show that there cannot exist any algorithm that, given as input an element of S, decides which is the case.
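
For context on the memoryless case mentioned in the abstract, the following is a minimal sketch of the Blahut-Arimoto iteration for a discrete memoryless channel, assuming numpy. The function name, tolerance, iteration limit, and the binary symmetric channel used in the example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10_000):
    """Estimate the capacity (in bits) of a discrete memoryless channel.

    W[x, y] is the probability of output y given input x; each row must sum to 1.
    Returns the mutual information at the final input distribution and that distribution.
    """
    n_in, _ = W.shape
    p = np.full(n_in, 1.0 / n_in)            # start from the uniform input distribution
    for _ in range(max_iter):
        q = W * p[:, None]                    # joint p(x) W(y|x)
        q /= q.sum(axis=0, keepdims=True)     # posterior q(x|y)
        # Update: p(x) proportional to exp( sum_y W(y|x) log q(x|y) )
        log_q = np.where(q > 0, np.log(q), 0.0)
        r = np.exp(np.sum(W * log_q, axis=1))
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information I(X;Y) in bits at the final input distribution
    out = p @ W                               # output distribution p(y)
    ratio = np.where(W > 0, W / out[None, :], 1.0)
    return np.sum(p[:, None] * W * np.log2(ratio)), p

if __name__ == "__main__":
    # Binary symmetric channel with crossover probability 0.1:
    # the capacity should be close to 1 - h(0.1), about 0.531 bits.
    W = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
    capacity, p_opt = blahut_arimoto(W)
    print(f"capacity ~ {capacity:.4f} bits, optimal input {p_opt}")
```

The contrast drawn in the paper is that no analogous procedure can exist for the FSMC family considered: the constructed subfamily S cannot even be classified into capacity at least 1 versus capacity at most 1/2 by any algorithm.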