Inference Under Information Constraints I: Lower Bounds From Chi-Square Contraction

IEEE Transactions on Information Theory (2020)

Citations: 77 | Views: 477
Abstract
Multiple players are each given one independent sample, about which they can provide only limited information to a central referee. Each player is allowed to describe its observed sample to the referee using a channel from a family of channels W, which can be instantiated to capture, among others, both the communication- and privacy-constrained settings. The referee uses the players' messages to solve an inference problem on the unknown distribution that generated the samples. We derive lower bounds for the sample complexity of learning and testing discrete distributions in this information-constrained setting. Underlying our bounds is a characterization of the contraction in chi-square distance between the observed distributions of the samples when information constraints are placed. This contraction is captured in a local neighborhood in terms of the chi-square and decoupled chi-square fluctuations of a given channel, two quantities we introduce. The former captures the average distance between distributions of the channel output for two product distributions on the input, and the latter for a product distribution and a mixture of product distributions on the input. Our bounds are tight for both public- and private-coin protocols. Interestingly, the sample complexity of testing is order-wise higher when restricted to private-coin protocols.
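The abstract's central object is the contraction of chi-square distance when samples are passed through a constrained channel. Below is a minimal numerical sketch (not the paper's method) illustrating the data-processing inequality for the chi-square divergence: applying any channel W to two input distributions can only shrink the divergence between them. The dimensions, the random channel, and the function names are illustrative assumptions.

```python
import numpy as np

def chi2(p, q):
    # Chi-square divergence: sum_i (p_i - q_i)^2 / q_i
    return float(np.sum((p - q) ** 2 / q))

rng = np.random.default_rng(0)
k, m = 8, 3  # k input symbols, m < k output symbols: a lossy, constrained channel

# Two nearby input distributions on k symbols (hypothetical example data)
p = rng.dirichlet(np.ones(k))
q = rng.dirichlet(np.ones(k))

# W[j, i] = Pr(output j | input i); each column is a distribution over outputs
W = rng.dirichlet(np.ones(m), size=k).T

before = chi2(p, q)          # divergence between the inputs
after = chi2(W @ p, W @ q)   # divergence between the channel outputs

# Data-processing inequality: the channel contracts chi-square distance
assert after <= before + 1e-12
```

The paper's chi-square fluctuation quantities refine this picture: rather than the worst-case contraction for a single pair, they control the average contraction over a local neighborhood of perturbed product distributions, which is what drives the sample-complexity lower bounds.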
Keywords
Distributed algorithms, inference algorithms, statistical analysis, minimax techniques, parameter estimation