Innovation and forward-thinking are needed to improve traditional synthesis methods: A response to Pescott and Stewart

Journal of Applied Ecology (2022)

Abstract
In Christie et al. (2019), we used simulations to quantitatively compare the bias of commonly used study designs in ecology and conservation. Based on these simulations, we proposed 'accuracy weights' as a potential way to account for study design validity in meta-analytic weighting methods. Pescott and Stewart (2022) raised concerns that these weights may not be generalisable and may still lead to biased meta-estimates. Here we respond to their concerns and demonstrate why developing alternative weighting methods is key to the future of evidence synthesis. We acknowledge that our simple simulation unfairly penalised randomised controlled trial (RCT) designs relative to before-after control-impact (BACI) designs because we assumed that the parallel trends assumption held for BACI designs. We point to an empirical follow-up study in which we more fairly quantify differences in bias between study designs. However, we stand by our main findings that before-after (BA), control-impact (CI) and after designs are quantifiably more biased than BACI and RCT designs. We also emphasise that our 'accuracy weighting' method was preliminary, and we welcome future research to incorporate more dimensions of study quality. We further show that over a decade of advances in quality effects modelling, which Pescott and Stewart (2022) omit, highlights the importance of research such as ours in better understanding how to quantitatively integrate data on study quality directly into meta-analyses. We further argue that the traditional methods advocated by Pescott and Stewart (2022; e.g. manual risk-of-bias assessments and inverse-variance weighting) are subjective, wasteful and potentially biased themselves. They also lack the scalability needed for large syntheses that keep up to date with the rapidly growing scientific literature. Synthesis and applications. We suggest, contrary to Pescott and Stewart's narrative, that moving towards alternative weighting methods is key to future-proofing evidence synthesis through greater automation, flexibility and updating to respond to decision-makers' needs, particularly in crisis disciplines in conservation science where problematic biases and variability exist in study designs, contexts and metrics used. While we must be cautious to avoid misinforming decision-makers, this should not stop us from investigating alternative weighting methods that integrate study quality data directly into meta-analyses. To reliably and pragmatically inform decision-makers with science, we need efficient, scalable, readily automated and feasible methods to appraise and weight studies to produce the large-scale living syntheses of the future.
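To make the contrast between the weighting approaches discussed above concrete, the sketch below compares a classic inverse-variance fixed-effect meta-estimate with a hypothetical design-adjusted weighting in which weights are multiplied by an assumed accuracy factor per study design. This is a minimal illustration only: the effect sizes, variances and `design_accuracy` values are invented, and the calculation is not Christie et al.'s (2019) actual accuracy-weighting method.

```python
# Illustrative sketch of meta-analytic weighting (hypothetical numbers throughout).
import numpy as np

effects = np.array([0.42, 0.55, 0.10, 0.35])    # study effect sizes (assumed)
variances = np.array([0.04, 0.09, 0.02, 0.05])  # within-study sampling variances (assumed)
designs = ["RCT", "BACI", "After", "CI"]        # study design of each effect size

# (1) Classic inverse-variance weighting: weight = 1 / sampling variance
w_iv = 1.0 / variances
est_iv = np.sum(w_iv * effects) / np.sum(w_iv)

# (2) Hypothetical design-based adjustment: down-weight designs assumed to be
#     more prone to bias (multipliers are illustrative assumptions, not the
#     published accuracy weights).
design_accuracy = {"RCT": 1.0, "BACI": 0.9, "CI": 0.4, "BA": 0.4, "After": 0.2}
w_adj = w_iv * np.array([design_accuracy[d] for d in designs])
est_adj = np.sum(w_adj * effects) / np.sum(w_adj)

print(f"Inverse-variance estimate:   {est_iv:.3f}")
print(f"Design-adjusted estimate:    {est_adj:.3f}")
```

Under these assumed numbers, the design-adjusted estimate shifts towards the studies from less bias-prone designs, which is the general behaviour at issue in the debate over quality-based weighting.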
Keywords
automation, bias adjustment, critical appraisal, dynamic meta-analyses, evidence synthesis, living reviews, quality effects modelling, risk of bias