l1,1/2 Regularization for Compressive Sensing

Semantic Scholar (2014)

Abstract
Recently, the design of group sparse regularization has drawn much attention in the group sparse signal recovery problem. Two of the most popular group-sparsity-inducing regularizations are the l1,2 and l1,∞ regularizations, defined as the sum of the l2 norms and the sum of the l∞ norms of the groups, respectively. Nevertheless, they may fail to account simultaneously for the intra-group and inter-group sparsity of the signal. For example, Huang and Zhang [14] assert that the l1,2 regularization is superior to the l1 regularization only for strongly group-sparse signals, which means that intra-group sparsity is not exploited by the l1,2 regularization. Our experiments show that recovering signals with intra-group sparsity by the l1,∞ regularization requires more measurements than recovering signals without it. In this paper, we propose a novel group-sparsity-inducing regularization defined as a mixture of the l1 norm and the l1/2 quasi-norm, referred to as the l1,1/2 regularization, which overcomes the above shortcomings of the l1,2 and l1,∞ regularizations. We define a new null space property for the l1,1/2 regularization and apply it to establish a recoverability theory for signals that are sparse both within and across groups. In addition, we introduce an iteratively reweighted algorithm to solve this model and analyze its convergence. Comprehensive experiments on simulated data show that the proposed l1,1/2 regularization is superior to the l1,2 and l1,∞ regularizations, especially for signals with a low inter-group sparsity level and a high intra-group sparsity level.
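
The group regularizers compared in the abstract can be summarized as follows. The exact form of the proposed l1,1/2 penalty (an outer l1 sum over groups of a per-group l1/2 quasi-norm) is an assumption inferred from the abstract's wording, not a definition quoted from the paper.

```latex
% Signal x partitioned into groups x_{g_1}, ..., x_{g_m}.
% Classical mixed-norm group regularizers (sum over groups):
%   \|x\|_{1,2}      = \sum_{i=1}^{m} \|x_{g_i}\|_2
%   \|x\|_{1,\infty} = \sum_{i=1}^{m} \|x_{g_i}\|_\infty
% Assumed form of the proposed penalty (outer l1 over groups,
% inner l1/2 quasi-norm within each group):
%   \|x\|_{1,1/2} = \sum_{i=1}^{m} \|x_{g_i}\|_{1/2},
%   where \|v\|_{1/2} = \Big(\sum_j |v_j|^{1/2}\Big)^{2}.
```

The abstract does not spell out the paper's iteratively reweighted algorithm. Below is a minimal, generic iteratively-reweighted-least-squares (IRLS) sketch in Python for a penalty of the assumed form above; the function name, the smoothing constant eps, and the gradient-matching weight update are illustrative choices, not taken from the paper.

```python
import numpy as np

def irls_group_l1_half(A, y, groups, lam=0.1, eps=1e-6, n_iter=50):
    """Generic IRLS sketch for min_x ||A x - y||_2^2 + lam * sum_g ||x_g||_{1/2}.

    `groups` is a list of index arrays partitioning the coordinates of x.
    Illustrative reweighting scheme only; not the paper's algorithm.
    """
    m, n = A.shape
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # least-squares initialization
    for _ in range(n_iter):
        w = np.empty(n)
        for g in groups:
            xg = np.abs(x[g]) + eps
            # d/dx_j of ||x_g||_{1/2} = (sum_i |x_i|^{1/2})^2 is
            # (sum_i |x_i|^{1/2}) * |x_j|^{-1/2} * sign(x_j); matching the
            # gradient of a quadratic surrogate at the current iterate gives
            # per-coordinate weights:
            s = np.sum(np.sqrt(xg))
            w[g] = s / (2.0 * xg ** 1.5)
        # Solve the weighted ridge subproblem:
        #   min_x ||A x - y||^2 + lam * sum_j w_j * x_j^2
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
    return x
```

For example, with A of shape (m, n), measurements y, and groups = [np.arange(0, 4), np.arange(4, 8), ...], calling irls_group_l1_half(A, y, groups) returns a candidate group-sparse solution; each pass re-estimates the weights from the current iterate and solves a ridge-type linear system.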