On Deterministically Approximating Total Variation Distance

arXiv (Cornell University), 2023

Abstract
Total variation distance (TV distance) is an important measure of the difference between two distributions. Recently, there has been progress in approximating the TV distance between product distributions: a deterministic algorithm for a restricted class of product distributions (Bhattacharyya, Gayen, Meel, Myrisiotis, Pavan and Vinodchandran 2023) and a randomized algorithm for general product distributions (Feng, Guo, Jerrum and Wang 2023). We give a deterministic fully polynomial-time approximation scheme (FPTAS) for the TV distance between product distributions. Given two product distributions $\mathbb{P}$ and $\mathbb{Q}$ over $[q]^n$, our algorithm approximates their TV distance with relative error $\varepsilon$ in time $O\bigl( \frac{qn^2}{\varepsilon} \log q \log \frac{n}{\varepsilon \Delta_{\text{TV}}(\mathbb{P},\mathbb{Q})} \bigr)$. Our algorithm is built around two key concepts: 1) the likelihood ratio, viewed as a distribution, which captures sufficient information to compute the TV distance; and 2) a new metric between likelihood ratio distributions, called the minimum total variation distance. Our algorithm computes a sparsified likelihood ratio distribution that is close to the original one with respect to this new metric, and the approximate TV distance can then be computed from the sparsified likelihood ratio distribution. Our technique also yields a deterministic FPTAS for the TV distance between Markov chains.
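One standard way to see that the likelihood ratio, as a distribution, determines the TV distance is the identity $\Delta_{\text{TV}}(\mathbb{P},\mathbb{Q}) = \sum_x \max(\mathbb{P}(x) - \mathbb{Q}(x), 0) = \mathbb{E}_{x \sim \mathbb{P}}\bigl[\max(0,\, 1 - \mathbb{Q}(x)/\mathbb{P}(x))\bigr]$. The sketch below is only a brute-force illustration of this identity for tiny $n$ and $q$; the function name `tv_distance_product` and the toy marginals are hypothetical, and the sketch does not reproduce the paper's sparsification-based FPTAS.

```python
import itertools
import math

def tv_distance_product(P, Q):
    """Exact TV distance between two product distributions over [q]^n.

    P and Q are lists of n marginal distributions; P[i][a] is the probability
    that coordinate i equals a under P. Brute force over all q^n outcomes,
    so only feasible for tiny n and q; it illustrates the identity
    TV(P, Q) = sum_x max(P(x) - Q(x), 0) = E_{x~P}[max(0, 1 - Q(x)/P(x))].
    """
    n, q = len(P), len(P[0])
    tv = 0.0
    for x in itertools.product(range(q), repeat=n):
        px = math.prod(P[i][x[i]] for i in range(n))
        qx = math.prod(Q[i][x[i]] for i in range(n))
        tv += max(px - qx, 0.0)  # only the positive part contributes to TV
    return tv

# Toy example (hypothetical marginals): two product distributions over {0,1}^3.
P = [[0.7, 0.3], [0.5, 0.5], [0.9, 0.1]]
Q = [[0.6, 0.4], [0.5, 0.5], [0.8, 0.2]]
print(tv_distance_product(P, Q))
```

The brute force above takes $q^n$ time; the point of the paper is to avoid this blow-up by sparsifying the distribution of the likelihood ratio while staying close in the minimum total variation metric.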
Keywords
variation, distance