Prediction of pan-solid tumor pembrolizumab benefit by integrating tumor mutation and gene expression profiling
Research Square (2022)
Strata Oncology | Ochsner Cancer Institute | University of Wisconsin–Madison | University of Alabama at Birmingham | Prisma Health Greenville Memorial Hospital | Lineberger Comprehensive Cancer Center | Cancer Care and Research Center | Aurora Cancer Care | Kaiser Permanente Southern California | SCL Health-CO | Kaiser Permanente Colorado | Gundersen Health System | Waukesha Memorial Hospital | Kaiser Permanente of the Mid-Atlantic States | Kaiser Permanente - Northern California | Bon Secours St. Francis | Lehigh Valley Health Network | MultiCare Regional Cancer Center
- Pretraining has recently greatly advanced the development of natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and the large M6 with 10 billion parameters can achieve better performance
- We propose a method called M6 that is able to process information of multiple modalities and perform both single-modal and cross-modal understanding and generation
- The model is scaled up to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baselines in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models on increasing amounts of data to explore the limits of their performance
