Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples

International Conference on Algorithmic Learning Theory (2023)

Abstract
We study the problem of estimating mixtures of Gaussians under the constraint of differential privacy (DP). Our main result is that poly(k, d, 1/α, 1/ε, log(1/δ)) samples are sufficient to estimate a mixture of k Gaussians in ℝ^d up to total variation distance α while satisfying (ε, δ)-DP. This is the first finite sample complexity upper bound for the problem that does not make any structural assumptions on the GMMs. To solve the problem, we devise a new framework which may be useful for other tasks. At a high level, we show that if a class of distributions (such as Gaussians) is (1) list decodable and (2) admits a "locally small" cover (Bun et al., 2021) with respect to total variation distance, then the class of its mixtures is privately learnable. The proof circumvents a known barrier indicating that, unlike Gaussians, GMMs do not admit a locally small cover (Aden-Ali et al., 2021b).
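The main bound stated in the abstract can be written out in display form (a restatement in standard notation; the symbols k, d, α, ε, δ are exactly those defined above):

```latex
n \;=\; \mathrm{poly}\!\left(k,\; d,\; \frac{1}{\alpha},\; \frac{1}{\varepsilon},\; \log\frac{1}{\delta}\right)
```

Here n is the number of samples sufficient to learn a mixture of k Gaussians in ℝ^d to total variation distance α under (ε, δ)-differential privacy.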