False discovery rate control under reduced precision computation for analysis of neuroimaging data

arXiv: Methodology (2018)

Abstract
The mitigation of false positives is an important issue when conducting multiple hypothesis testing. The most popular paradigm for mitigating false positives in high-dimensional applications is control of the false discovery rate (FDR). Multiple testing data from neuroimaging experiments can be very large, and reduced-precision storage of such data is often required; reduced-precision computation is also a common problem in the analysis of legacy data and of data arising from legacy pipelines. We present a method for FDR control that is applicable when only p-values or test statistics (with a common and known null distribution) are available, and when those p-values or test statistics are encoded in a reduced-precision format. Our method is based on an empirical-Bayes paradigm in which the probit transformations of the p-values (the z-scores) are modeled as a two-component mixture of normal distributions. Because of the reduced precision of the p-values or test statistics, the usual approach to fitting mixture models may not be feasible. We instead use a binned-data technique, which can be proved to estimate the z-score distribution parameters consistently under mild correlation assumptions of the kind that typically hold for neuroimaging data. A simulation study shows that our methodology is competitive with popular alternatives, especially in the presence of model misspecification. We demonstrate the applicability of our methodology in practice via a brain imaging study of mice.
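The abstract outlines the pipeline at a high level: probit-transform the (possibly coarsely rounded) p-values into z-scores, fit a two-component normal mixture to the binned z-scores, and use the fitted mixture for empirical-Bayes FDR control via local false discovery rates. The following Python sketch is only a rough illustration of that binned-data step, not the authors' implementation; the function name, bin count, starting values, and local-FDR threshold are all illustrative assumptions.

```python
# Minimal sketch of the binned-data idea described in the abstract.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def binned_mixture_lfdr(p_values, n_bins=60):
    """Fit pi0*N(0,1) + (1-pi0)*N(mu, sigma^2) to binned z-scores and
    return local false discovery rates. Assumes the p-values share a
    common, known null so that z = Phi^{-1}(p) is standard normal
    under the null."""
    # Clip to avoid infinite z-scores when reduced precision rounds
    # p-values all the way to 0 or 1.
    p = np.clip(np.asarray(p_values, dtype=float), 1e-12, 1.0 - 1e-12)
    z = norm.ppf(p)  # probit transform

    # With low-precision storage many z-scores are tied, so only the
    # bin counts carry usable information; fit the mixture through the
    # multinomial likelihood of the counts rather than the raw values.
    edges = np.linspace(z.min() - 1e-6, z.max() + 1e-6, n_bins + 1)
    counts, _ = np.histogram(z, bins=edges)

    def bin_probs(theta):
        pi0, mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        cdf = pi0 * norm.cdf(edges) + (1.0 - pi0) * norm.cdf(edges, mu, sigma)
        pr = np.maximum(np.diff(cdf), 1e-300)
        return pr / pr.sum()  # condition on the observed range

    def neg_loglik(theta):
        return -np.sum(counts * np.log(bin_probs(theta)))

    fit = minimize(neg_loglik, x0=np.array([0.9, -2.0, 0.0]),
                   method="L-BFGS-B",
                   bounds=[(1e-3, 1 - 1e-3), (-10.0, 10.0), (-3.0, 3.0)])
    pi0, mu, sigma = fit.x[0], fit.x[1], np.exp(fit.x[2])

    # Empirical-Bayes local FDR: posterior probability that a test is null.
    f0 = norm.pdf(z)
    f1 = norm.pdf(z, mu, sigma)
    lfdr = pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)
    return lfdr, (pi0, mu, sigma)

# Toy usage: 900 nulls and 100 signals, stored at 3 decimal digits.
rng = np.random.default_rng(0)
p_raw = np.concatenate([rng.uniform(size=900),
                        norm.cdf(rng.normal(-3.0, 1.0, size=100))])
lfdr, params = binned_mixture_lfdr(np.round(p_raw, 3))
discoveries = lfdr < 0.20  # flag tests with small local FDR
```

Maximizing the likelihood of the bin counts rather than of the raw values is what makes the fit insensitive to the ties that reduced-precision storage introduces, which is the point of the binned-data technique the abstract describes.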