Information Decomposition Diagrams Applied beyond Shannon Entropy: A Generalization of Hu's Theorem
CoRR (2022)
Abstract
In information theory, one major goal is to find useful functions that
summarize the amount of information contained in the interaction of several
random variables. Specifically, one can ask how the classical Shannon entropy,
mutual information, and higher interaction information relate to each other.
This is answered by Hu's theorem, which is widely known in the form of
information diagrams: it relates shapes in a Venn diagram to information
functions, thus establishing a bridge from set theory to information theory. In
this work, we view random variables together with the joint operation as a
monoid that acts by conditioning on information functions, and entropy as a
function satisfying the chain rule of information. This abstract viewpoint
allows us to prove a generalization of Hu's theorem. It applies to Shannon and
Tsallis entropy, (Tsallis) Kullback-Leibler Divergence, cross-entropy,
Kolmogorov complexity, submodular information functions, and the generalization
error in machine learning. Our result implies for Chaitin's Kolmogorov
complexity that the interaction complexities of all degrees are in expectation
close to Shannon interaction information. For well-behaved probability
distributions on increasing sequence lengths, the per-bit expected interaction
complexity and interaction information asymptotically coincide, establishing a
strong bridge between algorithmic and classical information theory.
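
The two-variable case of the Venn-diagram correspondence described above can be illustrated numerically: the "intersection" region equals the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), by inclusion-exclusion. The following sketch checks this identity for an arbitrarily chosen joint distribution of two binary variables (the distribution itself is not from the paper, only an illustration):

```python
import math

def H(p):
    """Shannon entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Illustrative joint distribution of two correlated binary variables X, Y.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = {x: sum(pxy[(x, y)] for y in (0, 1)) for x in (0, 1)}
py = {y: sum(pxy[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Venn-diagram intersection = mutual information, via inclusion-exclusion:
# I(X;Y) = H(X) + H(Y) - H(X,Y).
I_xy = H(px) + H(py) - H(pxy)
print(f"H(X)={H(px):.4f}  H(Y)={H(py):.4f}  H(X,Y)={H(pxy):.4f}  I(X;Y)={I_xy:.4f}")
```

For this distribution the mutual information is positive, reflecting the correlation between X and Y; higher-degree interaction information terms extend the same inclusion-exclusion pattern to three or more variables, which is the content of Hu's theorem.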