Support Size Estimation: The Power of Conditioning

MFCS 2022

Abstract
We consider the problem of estimating the support size of a distribution $D$. Our investigation is pursued through the lens of distribution testing and seeks to understand the power of conditional sampling (denoted COND), wherein one is allowed to query the given distribution conditioned on an arbitrary subset $S$. The primary contribution of this work is a new approach to lower bounds for the COND model that relies on powerful tools from information theory and communication complexity. Our approach yields surprisingly strong lower bounds for the COND model and its extensions.

1) We bridge the longstanding gap between the upper bound $O(\log \log n + \frac{1}{\epsilon^2})$ and the lower bound $\Omega(\sqrt{\log \log n})$ for the COND model by providing a nearly matching lower bound. Surprisingly, we show that even if the actual probabilities of the COND samples are revealed, $\Omega(\log \log n + \frac{1}{\epsilon^2 \log (1/\epsilon)})$ queries are still necessary.

2) We obtain the first non-trivial lower bound for COND equipped with an additional oracle that reveals the conditional probabilities of the samples (to the best of our knowledge, this model subsumes all previously studied models): in particular, we demonstrate that $\Omega(\log \log \log n + \frac{1}{\epsilon^2 \log (1/\epsilon)})$ queries are necessary.
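For readers unfamiliar with the query model, the following is a minimal sketch of what a single COND query does, assuming the unknown distribution is represented as a probability vector over $\{0, \dots, n-1\}$; the function name `cond_query` and the toy distribution are illustrative choices, not part of the paper, and this illustrates only the oracle itself, not the paper's estimation algorithms or lower-bound constructions.

```python
import random

def cond_query(probs, S):
    """One COND query: sample from the distribution `probs` restricted to
    the subset S and renormalized. Returns a single element of S."""
    S = list(S)
    mass = sum(probs[i] for i in S)
    if mass == 0:
        raise ValueError("conditioning set has zero probability mass")
    weights = [probs[i] / mass for i in S]
    return random.choices(S, weights=weights, k=1)[0]

# Toy example: a distribution on n = 8 elements with support size 4.
probs = [0.25, 0.25, 0.25, 0.25, 0.0, 0.0, 0.0, 0.0]
# Conditioning on {0, 1, 4, 5} can only ever return 0 or 1,
# since elements 4 and 5 carry no probability mass.
print(cond_query(probs, {0, 1, 4, 5}))
```

A support-size estimator in this model chooses the subsets $S$ adaptively and only observes the returned samples; the paper's second result additionally lets the algorithm see the conditional probability of each returned sample.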
Keywords
estimation, conditioning, support, size