Entropy and Source Coding for Integer-Dimensional Singular Random Variables

IEEE Trans. Information Theory (2016)

Abstract
Entropy and differential entropy are important quantities in information theory. A tractable extension to singular random variables—which are neither discrete nor continuous—has not been available so far. Here, we present such an extension for the practically relevant class of integer-dimensional singular random variables. The proposed entropy definition contains the entropy of discrete random variables and the differential entropy of continuous random variables as special cases. We show that it transforms in a natural manner under Lipschitz functions, and that it is invariant under unitary transformations. We define joint entropy and conditional entropy for integer-dimensional singular random variables, and we show that the proposed entropy conveys useful expressions of the mutual information. As first applications of our entropy definition, we present a result on the minimal expected codeword length of quantized integer-dimensional singular sources and a Shannon lower bound for integer-dimensional singular sources.
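To make the "special cases" claim concrete, here is a minimal numerical sketch. It assumes (as a plausible reading of the abstract, not as the paper's exact construction) that the entropy of a d-dimensional random variable is defined as the differential entropy of its density taken with respect to the d-dimensional Hausdorff measure on its support. For a distribution that is uniform on its support, this reduces in all three regimes to the logarithm of the support's measure; the function name and structure below are illustrative only.

```python
import math

def entropy_uniform_on_support(measure_of_support):
    # Sketch: for a distribution uniform on a d-dimensional support set,
    # the density w.r.t. d-dimensional Hausdorff measure is the constant
    # 1/measure, so the entropy -E[log f] equals log(measure).
    return math.log(measure_of_support)

# Discrete special case: fair coin. Support has counting measure 2,
# and log(2) is its Shannon entropy.
h_coin = entropy_uniform_on_support(2.0)

# Continuous special case: uniform on the unit square [0, 1]^2.
# Lebesgue measure 1 gives differential entropy 0.
h_square = entropy_uniform_on_support(1.0)

# Singular case: uniform on the unit circle in R^2 -- a 1-dimensional
# support embedded in 2-D space, with arc length 2*pi. Neither discrete
# entropy nor 2-D differential entropy applies, but the Hausdorff-measure
# density gives the finite value log(2*pi).
h_circle = entropy_uniform_on_support(2.0 * math.pi)
```

The circle example is exactly the kind of integer-dimensional singular random variable the abstract targets: its distribution has no probability mass function and no density with respect to 2-D Lebesgue measure, yet a one-dimensional density on its support yields a finite entropy.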
Keywords
Entropy, Random variables, Quantization (signal), Mutual information, Rate-distortion, Density functional theory