Optimality Of The Plug-In Estimator For Differential Entropy Estimation Under Gaussian Convolutions

2019 IEEE International Symposium on Information Theory (ISIT), 2019

Abstract
This paper establishes the optimality of the plug-in estimator for the problem of differential entropy estimation under Gaussian convolutions. Specifically, we consider the estimation of the differential entropy h(X + Z), where X and Z are independent d-dimensional random variables with Z ∼ N(0, σ²I_d). The distribution of X is unknown and belongs to some nonparametric class, but n independent and identically distributed samples from it are available. We first show that, despite the regularizing effect of the noise, any estimator that achieves a given additive accuracy for this problem must have a sample complexity that is exponential in d. We then analyze the absolute-error risk of the plug-in estimator and show that it converges as c_d/√n, thus attaining the parametric estimation rate. This implies the optimality of the plug-in estimator for the considered problem. We provide numerical results comparing the performance of the plug-in estimator to general-purpose (unstructured) differential entropy estimators (based on kernel density estimation (KDE) or k nearest neighbors (kNN) techniques) applied to samples of X + Z. These results reveal a significant empirical superiority of the plug-in estimator over state-of-the-art KDE- and kNN-based methods.
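To make the estimator concrete, the following is a minimal sketch, assuming NumPy/SciPy, of how the plug-in estimate can be evaluated: convolve the empirical measure of the samples with N(0, σ²I_d) to get a Gaussian mixture, then approximate its differential entropy by Monte Carlo. The function name plug_in_entropy, its parameters, and the usage example are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def plug_in_entropy(x_samples, sigma, n_mc=10_000, rng=None):
    """Monte Carlo evaluation of the plug-in entropy estimate.

    The plug-in estimate is h(P_hat * N(0, sigma^2 I_d)), i.e. the
    differential entropy of the Gaussian mixture (1/n) sum_i N(x_i, sigma^2 I_d)
    obtained by convolving the empirical measure of x_samples with the noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = x_samples.shape

    # Draw Monte Carlo points Y = X_I + Z from the mixture itself.
    idx = rng.integers(n, size=n_mc)
    y = x_samples[idx] + sigma * rng.standard_normal((n_mc, d))

    # Squared distances ||y_j - x_i||^2 via the expansion a^2 - 2ab + b^2
    # (avoids materializing an (n_mc, n, d) intermediate array).
    sq = ((y ** 2).sum(axis=1)[:, None]
          - 2.0 * y @ x_samples.T
          + (x_samples ** 2).sum(axis=1)[None, :])          # shape (n_mc, n)

    # Log-density of the mixture at each y via log-sum-exp over the n components.
    log_comp = -sq / (2.0 * sigma ** 2) - 0.5 * d * np.log(2.0 * np.pi * sigma ** 2)
    log_p = logsumexp(log_comp, axis=1) - np.log(n)

    # Entropy estimate: -E[log p_hat(Y)], approximated by the Monte Carlo average.
    return -np.mean(log_p)


# Illustrative usage: X uniform on the unit cube in d = 3, noise level sigma = 0.5.
x = np.random.default_rng(0).random((500, 3))
print(plug_in_entropy(x, sigma=0.5))
```

Since the mixture density is known in closed form given the samples, the only approximation error in this sketch comes from the Monte Carlo average, which can be reduced by increasing n_mc.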
Keywords
plug-in estimator, differential entropy estimation, Gaussian convolutions, parametric estimation rate, general-purpose differential entropy estimators, kernel density estimation