Pointwise lossy source coding theorem for sources with memory

Information Theory Proceedings (2012)

Abstract
We investigate the minimum pointwise redundancy of variable-length lossy source codes operating at fixed distortion for sources with memory. The redundancy is defined by l_n(X_1^n) - nR(D), where l_n(X_1^n) is the code length at block size n and R(D) is the rate distortion function. We restrict ourselves to the case where R(D) can be calculated, namely the cases where the Shannon lower bound to R(D) holds with equality. In this case, for balanced distortion measures, we provide a pointwise lower bound on the code length sequence in terms of the entropy density process. We show that the minimum coding variance with distortion is lower bounded by the minimum lossless coding variance, and is non-zero unless the entropy density is deterministic. We also examine lossy coding in the presence of long-range dependence, showing the existence of information sources for which long-range dependence persists under any codec operating at the Shannon lower bound with fixed distortion.
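
For orientation, a minimal LaTeX sketch of the quantities named in the abstract is given below. The redundancy definition follows the abstract's own notation; the entropy density and the balanced-distortion form of the Shannon lower bound are stated in their standard textbook forms as an assumed reference, not quoted from the paper, and the symbols \rho_n, \imath_n, q_j, d_j and \bar{H}(X) are notation introduced only for this sketch.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Pointwise redundancy of a variable-length lossy code at block length n,
% as defined in the abstract (code lengths measured in nats):
\[
  \rho_n \;=\; l_n(X_1^n) \;-\; n\,R(D).
\]

% Entropy density of the block X_1^n, the pointwise quantity in terms of
% which the lower bound on the code length sequence is expressed:
\[
  \imath_n(X_1^n) \;=\; -\ln P_{X_1^n}\!\bigl(X_1^n\bigr).
\]

% Standard Shannon lower bound for a balanced distortion measure whose rows
% each contain the distortion values d_1, ..., d_M; \bar{H}(X) denotes the
% entropy rate of the source, and the paper's setting is R(D) = R_SLB(D):
\[
  R_{\mathrm{SLB}}(D) \;=\; \bar{H}(X)
  \;-\; \max\Bigl\{\, H(q) \;:\; \sum_{j=1}^{M} q_j\, d_j \le D \,\Bigr\}.
\]

\end{document}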
Keywords
entropy codes,rate distortion theory,source coding,Shannon lower bound,balanced distortion measures,block size,code length sequence,codec,entropy density process,information sources,minimum lossless coding variance,minimum pointwise redundancy,pointwise lossy source coding theorem,pointwise lower bound,rate distortion function,variable length lossy source codes