Rates and predictors of data and code sharing in the medical and health sciences: A systematic review with meta-analysis of individual participant data

medRxiv (Cold Spring Harbor Laboratory), 2023

Abstract
Objectives: Many meta-research studies have investigated rates and predictors of data and code sharing in medicine. However, most of these studies have been narrow in scope and modest in size. We aimed to synthesise the findings of this body of research to provide an accurate picture of how common data and code sharing is, how this frequency has changed over time, and what factors are associated with sharing.

Design: Systematic review with meta-analysis of individual participant data (IPD) from meta-research studies.

Data sources: Ovid MEDLINE, Ovid Embase, MetaArXiv, medRxiv, and bioRxiv were searched from inception to July 1st, 2021.

Eligibility criteria: Studies that investigated data or code sharing across a sample of scientific articles presenting original medical and health research.

Data extraction and synthesis: Two authors independently screened records, assessed risk of bias, and extracted summary data from study reports. IPD were requested from authors when not publicly available. Key outcomes of interest were the prevalence of statements declaring that data or code were publicly available or 'available on request' (declared availability), and the success rates of retrieving these products (actual availability). The associations between data and code availability and several factors (e.g., journal policy, data type, study design, research subjects) were also examined. A two-stage approach to IPD meta-analysis was performed, with proportions and risk ratios pooled using the Hartung-Knapp-Sidik-Jonkman method for random-effects meta-analysis. Three-level random-effects meta-regressions were also performed to evaluate the influence of publication year on sharing rates.

Results: 105 meta-research studies examining 2,121,580 articles across 31 specialties were included in the review. Eligible studies examined a median of 195 primary articles (IQR: 113-475), with a median publication year of 2015 (IQR: 2012-2018). Only eight studies (8%) were classified as low risk of bias. Usable IPD were assembled for 100 studies (2,121,197 articles), of which 94 datasets passed independent reproducibility checks. Meta-analyses revealed declared and actual public data availability rates of 8% (95% CI: 5-11%, 95% PI: 0-30%, k=27, o=700,054) and 2% (95% CI: 1-3%, 95% PI: 0-11%, k=25, o=11,873), respectively, since 2016. Meta-regression indicated that only declared data sharing rates have increased significantly over time. For public code sharing, both declared and actual availability rates were estimated to be less than 0.5% since 2016, and neither demonstrated any meaningful increase over time. Only 33% of authors (95% CI: 5-69%, k=3, o=429) were estimated to comply with mandatory data sharing policies of journals.

Conclusion: Code sharing remains persistently low across medicine and health research. Declarations of data sharing are also low but are increasing; however, declared availability does not always correspond to the actual sharing of data. Mandatory data sharing policies of journals may not be as effective as expected and may vary in effectiveness according to data type, a finding that may inform policymakers when designing policies and allocating resources to audit compliance.

### Competing Interest Statement

The authors have declared no competing interest.

### Funding Statement

No funding was received for this study. DGH is a PhD candidate supported by an Australian Commonwealth Government Research Training Program Scholarship. MJP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618).

### Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained. Yes

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group and so cannot be used to identify individuals. Yes

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance). Yes

I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable. Yes

### Data Availability

Summary-level data and the code required to reproduce all the findings of the review are freely available on the Open Science Framework (DOI: 10.17605/OSF.IO/U3YRP) under a Creative Commons Zero v1.0 Universal (CC0 1.0) license. Harmonised versions of IPD that were originally made publicly available can be shared on request, whereas, to preserve the rights of data owners, harmonised versions of IPD that were shared privately with the review team will only be released with the permission of the data guarantor of the relevant meta-research study. To request harmonised IPD, please follow the instructions on the project's Open Science Framework page.
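To make the pooling method described in the abstract concrete, the following is a minimal sketch of a Hartung-Knapp-Sidik-Jonkman (HKSJ) random-effects meta-analysis of proportions. It is illustrative only and is not the review's own code (which is in the OSF repository): the function name `pooled_proportion_hksj`, the logit transform, the DerSimonian-Laird estimate of between-study variance, and the example counts are all assumptions made for this sketch; the review may have used different transforms or tau-squared estimators.

```python
import numpy as np
from scipy import stats


def pooled_proportion_hksj(events, totals, alpha=0.05):
    """Random-effects pooled proportion with an HKSJ confidence interval.

    Illustrative sketch: proportions are logit-transformed (with a 0.5
    continuity correction for boundary cells) and tau^2 is estimated with
    DerSimonian-Laird before applying the HKSJ variance adjustment.
    """
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    k = len(events)

    # Continuity correction for studies with zero or all events
    boundary = (events == 0) | (events == totals)
    x = events + 0.5 * boundary
    n = totals + 1.0 * boundary

    # Logit-transformed proportions and their within-study variances
    yi = np.log(x / (n - x))
    vi = 1.0 / x + 1.0 / (n - x)

    # DerSimonian-Laird estimate of between-study variance tau^2
    wi = 1.0 / vi
    y_fixed = np.sum(wi * yi) / np.sum(wi)
    Q = np.sum(wi * (yi - y_fixed) ** 2)
    c = np.sum(wi) - np.sum(wi ** 2) / np.sum(wi)
    tau2 = max(0.0, (Q - (k - 1)) / c)

    # Random-effects pooled estimate on the logit scale
    wi_star = 1.0 / (vi + tau2)
    mu = np.sum(wi_star * yi) / np.sum(wi_star)

    # HKSJ adjustment: weighted residual variance with a t-interval (k-1 df)
    var_hk = np.sum(wi_star * (yi - mu) ** 2) / ((k - 1) * np.sum(wi_star))
    t_crit = stats.t.ppf(1 - alpha / 2, df=k - 1)
    lo, hi = mu - t_crit * np.sqrt(var_hk), mu + t_crit * np.sqrt(var_hk)

    # Back-transform from the logit scale to proportions
    expit = lambda z: 1.0 / (1.0 + np.exp(-z))
    return expit(mu), expit(lo), expit(hi)


# Hypothetical example: declared data availability counts from five studies
print(pooled_proportion_hksj(events=[12, 30, 5, 80, 7],
                             totals=[150, 400, 95, 900, 120]))
```

The HKSJ step replaces the usual normal-approximation standard error with a weighted residual variance and a t-distribution with k-1 degrees of freedom, which generally widens the interval when the number of studies is small or heterogeneity is high, consistent with the wide prediction intervals reported in the Results.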
Keywords
code sharing, meta-analysis, individual participant data, systematic review, health sciences