Density estimation for shift-invariant multidimensional distributions.

ITCS (2019)

Abstract
We study density estimation for classes of distributions over $\mathbb{R}^d$. A multidimensional distribution is shift-invariant if, roughly speaking, it is close in total variation distance to a small shift of itself in any direction. Shift-invariance relaxes the smoothness assumptions commonly used in non-parametric density estimation to allow jump discontinuities. The different classes of distributions that we consider correspond to different rates of tail decay. For each such class we give an efficient algorithm that learns any distribution in the class from independent samples with respect to total variation distance. As a special case of our general result, we show that $d$-dimensional distributions which satisfy an exponential tail bound can be learned to total variation distance error $\epsilon$ using $\tilde{O}_d(1/\epsilon^{d+2})$ examples and $\tilde{O}_d(1/\epsilon^{2d+2})$ time. This implies that, for constant $d$, multivariate log-concave distributions can be learned in $\tilde{O}_d(1/\epsilon^{2d+2})$ time using $\tilde{O}_d(1/\epsilon^{d+2})$ samples, answering a question of [Diakonikolas, Kane and Stewart, 2016]. All of our results extend to a model of noise-tolerant density estimation using Huber's contamination model, in which the target distribution to be learned is a $(1-\epsilon,\epsilon)$ mixture of some unknown distribution in the class with some other arbitrary and unknown distribution, and the learning algorithm must output a hypothesis distribution with total variation distance error $O(\epsilon)$ from the target distribution. We show that our general results are close to best possible by proving a simple $\Omega(1/\epsilon^d)$ information-theoretic lower bound on sample complexity, which holds even for learning bounded shift-invariant distributions.
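The Huber contamination model mentioned in the abstract can be illustrated with a short sketch: samples are drawn from a $(1-\epsilon,\epsilon)$ mixture, so each draw comes from the in-class target with probability $1-\epsilon$ and from an arbitrary noise distribution with probability $\epsilon$. The sampler names below (`sample_clean`, `sample_noise`) are illustrative placeholders, not from the paper.

```python
import random

def sample_contaminated(sample_clean, sample_noise, eps, rng=random):
    """Draw one sample from a (1 - eps, eps) Huber contamination mixture:
    with probability 1 - eps, sample from the target (in-class) distribution;
    with probability eps, sample from an arbitrary, unknown noise distribution.
    `sample_clean` and `sample_noise` are zero-argument sampler callables."""
    if rng.random() < eps:
        return sample_noise()
    return sample_clean()
```

A learner in this model sees only the mixed stream of samples and must still output a hypothesis within total variation distance $O(\epsilon)$ of the target.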