An Efficient Sparse Bayesian Learning Algorithm Based on Gaussian-Scale Mixtures

IEEE Transactions on Neural Networks and Learning Systems (2022)

Abstract
Sparse Bayesian learning (SBL) is a popular machine learning approach with superior generalization capability due to the sparsity of its adopted model. However, it entails a matrix inversion at each iteration, which hinders its practical application to large-scale data sets. To overcome this bottleneck, we propose an efficient SBL algorithm with $\mathcal {O}(n^{2})$ computational complexity per iteration based on a Gaussian-scale mixture prior model. By specifying two different hyperpriors, the proposed efficient SBL algorithm can meet two different requirements, namely high efficiency and high sparsity. A surrogate function is introduced herein to approximate the posterior density of the model parameters and thereby avoid matrix inversions. Using a data-dependent term, a joint cost function with separate penalty terms is reformulated in the joint space of model parameters and hyperparameters. The resulting nonconvex optimization problem is solved using a block coordinate descent method within a majorization–minimization framework. Finally, the results of extensive experiments on sparse signal recovery and sparse image reconstruction benchmark problems substantiate the effectiveness and superiority of the proposed approach in terms of computational time and estimation error.
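To make the inversion-free idea concrete, the sketch below is a minimal, hypothetical illustration, not the authors' algorithm: the exact SBL posterior update, which requires inverting an $n \times n$ matrix, is replaced by a separable majorization–minimization surrogate built from a Lipschitz bound on the quadratic data-fit term, so each iteration costs only matrix–vector products. The function name `sbl_mm`, the fixed-point hyperparameter update, and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def sbl_mm(Phi, y, beta=1e4, n_iter=300, eps=1e-8):
    """Illustrative inversion-free SBL-style iteration (a sketch, not the paper's method).

    The quadratic data-fit term is majorized by a separable surrogate using
    L >= lambda_max(Phi^T Phi), so each weight update is a scalar shrinkage
    step; no n x n matrix inversion is performed.
    """
    m, n = Phi.shape
    L = np.linalg.norm(Phi, 2) ** 2      # Lipschitz constant of the data-fit gradient
    w = np.zeros(n)
    alpha = np.ones(n)                   # per-weight Gaussian prior precisions
    for _ in range(n_iter):
        # Majorization point: a plain gradient step on the data-fit term.
        u = w - Phi.T @ (Phi @ w - y) / L
        # Exact minimizer of the separable surrogate plus the Gaussian prior.
        w = (beta * L * u) / (beta * L + alpha)
        # Simple fixed-point hyperparameter update (assumed; promotes sparsity).
        alpha = 1.0 / (w ** 2 + eps)
    return w

# Toy usage: recover a 10-sparse signal from 80 noisy random measurements.
rng = np.random.default_rng(0)
m, n, k = 80, 200, 10
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
w_true = np.zeros(n)
w_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = Phi @ w_true + 0.01 * rng.standard_normal(m)
w_hat = sbl_mm(Phi, y)
print("relative error:", np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true))
```

Each iteration here is dominated by the two matrix–vector products with `Phi`, i.e., $\mathcal{O}(mn)$ work, which is the kind of per-iteration saving (relative to the cubic cost of an exact covariance update) that the paper's surrogate-based approach targets.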
Keywords
Gaussian scale mixture, iterative algorithms, optimization methods, regression, sparse Bayesian learning (SBL)