PPML-Omics: a Privacy-Preserving federated Machine Learning method protects patients’ privacy in omic data

bioRxiv (Cold Spring Harbor Laboratory)(2022)

Cited by 7
Abstract

Modern machine learning models applied to omic data analysis raise threats of privacy leakage for the patients included in those datasets. Despite advances in privacy technologies, existing methods tend to introduce excessive computational cost (e.g. cryptographic methods) or noise (e.g. differential privacy), hampering either model utility or the effectiveness of privacy protection in biological data. Here, we propose a secure and privacy-preserving machine learning method (PPML-Omics) built on a decentralized version of the differentially private federated learning algorithm. We applied PPML-Omics to data from three sequencing technologies and addressed the privacy concern in three major tasks of omic data analysis, namely cancer classification with bulk RNA-seq, clustering with single-cell RNA-seq, and the integration of spatial gene expression and tumour morphology with spatial transcriptomics, under three representative deep learning models. We also examined privacy breaches in depth through privacy-attack experiments and demonstrated that PPML-Omics can protect patients' privacy. In each of these applications, PPML-Omics outperformed the comparison methods under the same level of privacy guarantee, demonstrating the versatility of the method in balancing privacy-preserving capability and utility in practical omic data analysis. Furthermore, we give a theoretical proof of the privacy-preserving capability of PPML-Omics, making it the first mathematically guaranteed method with robust and generalizable empirical performance in protecting patients' privacy in omic data.
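The core pattern behind differentially private federated learning, as described in the abstract, is that each client's model update is clipped to bound its sensitivity and the aggregate is perturbed with Gaussian noise before being applied. The sketch below is a minimal illustration of that pattern in NumPy; the function names, the clipping bound, and the noise scaling are illustrative assumptions, not the paper's actual PPML-Omics algorithm (which additionally decentralizes the aggregation).

```python
import numpy as np

def clip_update(update, clip_norm):
    # Bound the L2 norm of a client update so its sensitivity is at most clip_norm.
    norm = np.linalg.norm(update)
    if norm > clip_norm:
        return update * (clip_norm / norm)
    return update

def dp_federated_average(client_updates, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Aggregate client updates with per-client clipping and Gaussian noise.

    This is the generic DP-FedAvg pattern (clip, average, add noise), shown
    only as a sketch of the technique the abstract refers to.
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = [clip_update(np.asarray(u, dtype=float), clip_norm)
               for u in client_updates]
    avg = np.mean(clipped, axis=0)
    # Noise standard deviation scales with the clipping bound and shrinks
    # as more clients contribute to the average.
    sigma = noise_multiplier * clip_norm / len(clipped)
    return avg + rng.normal(0.0, sigma, size=avg.shape)
```

In a real deployment the noise multiplier is chosen from a target (ε, δ) privacy budget via an accountant; here it is just a free parameter of the sketch.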
Keywords
machine learning, machine learning method, data, ppml-omics, privacy-preserving