Sample Complexity Lower Bounds for Compressive Sensing with Generative Models

2020 International Conference on Signal Processing and Communications (SPCOM)

Abstract
Recently, it has been shown that for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably-chosen generative model. In particular, in (Bora et al., 2017) it was shown that roughly O(k log L) random Gaussian measurements suffice for accurate recovery when the generative model is an L-Lipschitz function with bounded k-dimensional inputs, and that O(kd log w) measurements suffice when the generative model is a k-input ReLU network with depth d and width w. In this paper, we establish corresponding algorithm-independent lower bounds on the sample complexity using tools from minimax statistical analysis. In accordance with the above upper bounds, our results are summarized as follows: (i) We construct an L-Lipschitz generative model capable of generating group-sparse signals, and show that the resulting necessary number of measurements is Ω(k log L); (ii) Using similar ideas, we construct ReLU networks with high depth and/or high width for which the necessary number of measurements scales as Ω(kd log w / log n) (with output dimension n), and in some cases Ω(kd log w). As a result, we establish that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions.
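To make the measurement model concrete, the upper bounds that these lower bounds complement concern recovering an unknown signal from y = Ax* when x* lies near the range of a generative model G, typically by minimizing ||A G(z) − y||² over the latent z (Bora et al., 2017). The following is a minimal sketch of that setup, not the authors' code: the toy two-layer ReLU generator, the dimensions (k, w, n, m), the finite-difference gradient, and the step size are all illustrative assumptions chosen so the example is self-contained.

```python
# Minimal sketch of compressive sensing with a generative prior:
# recover x* = G(z*) from m < n Gaussian measurements y = A x*
# by gradient descent on z. All model/dimension choices are illustrative.
import numpy as np

rng = np.random.default_rng(0)
k, w, n, m = 5, 20, 100, 40  # latent dim, width, signal dim, measurements

# Toy two-layer ReLU generative model G: R^k -> R^n (arbitrary random weights).
W1 = rng.standard_normal((w, k)) / np.sqrt(k)
W2 = rng.standard_normal((n, w)) / np.sqrt(w)

def G(z):
    return W2 @ np.maximum(W1 @ z, 0.0)

A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian measurement matrix
z_true = rng.standard_normal(k)
y = A @ G(z_true)                             # noiseless measurements y = A G(z*)

def loss(z):
    r = A @ G(z) - y
    return r @ r

# Finite-difference gradient in z (fine for k this small; in practice
# one would use autodiff, as in the CSGM experiments of Bora et al.).
def grad(z, eps=1e-6):
    g = np.zeros_like(z)
    for i in range(k):
        e = np.zeros(k)
        e[i] = eps
        g[i] = (loss(z + e) - loss(z - e)) / (2 * eps)
    return g

z = rng.standard_normal(k)
for _ in range(2000):
    z -= 0.01 * grad(z)

print("signal recovery error:", np.linalg.norm(G(z) - G(z_true)))
```

The lower bounds in the paper are algorithm-independent: they apply to any recovery procedure, not just the descent-style method sketched above.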
Keywords
L-Lipschitz generative model, measurement scaling, sample complexity lower bounds, compressive sensing, bounded k-dimensional inputs, minimax statistical analysis, group-sparse signals, random Gaussian measurements, k-input ReLU network