Quantization of Random Distributions under KL Divergence

2021 IEEE International Symposium on Information Theory (ISIT), 2021

Citations: 4 | Views: 0
Abstract
Consider the problem of representing a distribution $\pi$ on a large alphabet of size $k$ up to fidelity $\epsilon$ in Kullback-Leibler (KL) divergence. Heuristically, arguing as for quadratic loss in high dimension, one expects that about $\frac{k}{2} \log \frac{1}{\epsilon}$ bits would be required. We show this intuition is correct by proving explicit non-asymptotic bounds for the minimal average distortion when $\pi$ is randomly sampled from a symmetric Dirichlet prior on the simplex. Our method is to reduce the single-sample problem to the traditional setting of iid samples, but for a non-standard rate-distortion question with the novel distortion measure $d(x, y) = x \log(x/y)$, which we call divergence distortion. Practically, our results advocate using an $x \mapsto x^{2/3}$ compander (for small $x$) followed by a uniform scalar quantizer for storing large-alphabet distributions.
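The last sentence of the abstract suggests a practical recipe: compand each probability with $x \mapsto x^{2/3}$, then apply a uniform scalar quantizer. The Python sketch below illustrates that recipe under assumed details not specified in the abstract (bit budget, clipping the quantizer index to keep reconstructed masses positive, and renormalizing back onto the simplex); it is an illustrative sketch, not the paper's exact scheme.

```python
import numpy as np

def quantize_distribution(p, bits_per_symbol=8):
    """Compander-based scalar quantization of a probability vector p (illustrative sketch).

    Companding with x -> x^(2/3), uniform quantization of the companded values,
    inversion, and renormalization. The bit budget, the index clipping, and the
    renormalization step are assumptions made for this example.
    """
    p = np.asarray(p, dtype=float)
    levels = 2 ** bits_per_symbol                              # uniform quantizer levels on [0, 1]
    companded = p ** (2.0 / 3.0)                               # compander for small x
    indices = np.round(companded * (levels - 1)).astype(int)   # stored integer codes
    indices = np.clip(indices, 1, levels - 1)                  # keep reconstruction positive so KL stays finite (assumption)
    reconstructed = (indices / (levels - 1)) ** 1.5            # invert the compander
    reconstructed /= reconstructed.sum()                       # project back onto the simplex
    return indices, reconstructed

def kl_divergence(p, q):
    """KL divergence D(p || q), skipping zero-probability entries of p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: quantize a distribution drawn from a symmetric Dirichlet prior on a large alphabet.
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(1000))
codes, p_hat = quantize_distribution(p, bits_per_symbol=8)
print("KL(p || p_hat) =", kl_divergence(p, p_hat))
```

Storing `codes` costs `bits_per_symbol` bits per alphabet symbol; the compander spends resolution where small probabilities need it, which is the regime the paper's divergence-distortion analysis targets.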
Keywords
Compression, rate distortion, quantization, Kullback-Leibler divergence, Shannon Lower Bound