Shrinkage with Shrunken Shoulders: Gibbs Sampling Shrinkage Model Posteriors with Guaranteed Convergence Rates

Bayesian Analysis (2023)

Abstract
Use of continuous shrinkage priors, with a "spike" near zero and heavy tails towards infinity, is an increasingly popular approach to induce sparsity in parameter estimates. When the parameters are only weakly identified by the likelihood, however, the posterior may end up with tails as heavy as the prior, jeopardizing robustness of inference. A natural solution is to "shrink the shoulders" of a shrinkage prior by lightening up its tails beyond a reasonable parameter range, yielding a regularized version of the prior. We develop a regularization approach which, unlike previous proposals, preserves computationally attractive structures of original shrinkage priors. We study theoretical properties of the Gibbs sampler on resulting posterior distributions, with emphasis on convergence rates of the Pólya-Gamma Gibbs sampler for sparse logistic regression. Our analysis shows that the proposed regularization leads to geometric ergodicity under a broad range of global-local shrinkage priors. Essentially, the only requirement is for the prior π_local(·) on the local scale λ to satisfy π_local(0) < ∞. If π_local(·) further satisfies lim_{λ→0} π_local(λ)/λ^a < ∞ for a > 0, as in the case of Bayesian bridge priors, we show the sampler to be uniformly ergodic.
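To make the "shrink the shoulders" idea concrete, here is a minimal sketch in Python. It assumes, purely for illustration, a half-Cauchy local-scale prior (a common heavy-tailed choice) and a Gaussian tail-lightening factor; the paper's actual regularization, chosen to preserve the prior's computational structure, may take a different form.

```python
import math

def halfcauchy_local(lam):
    # Half-Cauchy density on the local scale lam >= 0:
    # heavy polynomial tail, finite at zero (pi_local(0) < infinity).
    return 2.0 / (math.pi * (1.0 + lam * lam))

def regularized_local(lam, c=5.0):
    # "Shrunken shoulders" (illustrative form): multiply by a Gaussian
    # factor so the density is essentially unchanged for lam << c but
    # decays rapidly beyond the plausible range c.
    return halfcauchy_local(lam) * math.exp(-0.5 * (lam / c) ** 2)
```

Near zero the two densities agree to high accuracy, while far in the tail the regularized version is many orders of magnitude lighter, which is the behavior that rules out a heavy-tailed posterior when the likelihood is weakly informative.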