Protein Stability Prediction by Fine-tuning a Protein Language Model on a Mega-scale Dataset

Kit Sang Chu, Justin B. Siegel

bioRxiv (2023)

Abstract
The stability of a protein is crucial to its utility in industrial applications. While engineering campaigns can now routinely enhance protein thermal stability to the level needed in an industrial setting, there is significant demand for predictive tools that fast-track these efforts, allowing one to reach a highly stabilized protein in a minimal number of design iterations. In this work, we explore utilizing a mega-scale dataset to develop a protein language model tuned for stability. The model is trained on the folding stability of 528k sequences derived from 461 small protein domains and designs, and can accommodate deletions, insertions, and multiple point mutations. We show that a protein language model can be fine-tuned to predict folding stability. The fine-tuned protein language model, named ESMtherm, performs reasonably on small protein domains and generalizes to sequences distal from the training set. Lastly, we discuss its limitations relative to other state-of-the-art methods in generalizing to larger protein scaffolds, and highlight the need for large-scale stability measurements on a diverse dataset that represents the distribution of sequence lengths commonly observed in nature.

Competing Interest Statement: The authors have declared no competing interest.
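The abstract does not specify the training recipe behind ESMtherm, but the general approach it describes (fine-tuning a pretrained protein language model to regress a scalar folding-stability value from sequence) can be illustrated. Below is a minimal sketch assuming an ESM-2 backbone from Hugging Face transformers with a mean-pooled regression head and MSE loss; the checkpoint name, pooling strategy, hyperparameters, and toy data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): fine-tune an ESM-2 checkpoint with a
# scalar regression head to predict folding stability from a protein sequence.
import torch
from torch import nn
from transformers import AutoTokenizer, EsmModel

CHECKPOINT = "facebook/esm2_t12_35M_UR50D"  # assumed backbone; paper may differ

class StabilityRegressor(nn.Module):
    def __init__(self, checkpoint=CHECKPOINT):
        super().__init__()
        self.esm = EsmModel.from_pretrained(checkpoint)
        self.head = nn.Linear(self.esm.config.hidden_size, 1)  # one stability value

    def forward(self, input_ids, attention_mask):
        out = self.esm(input_ids=input_ids, attention_mask=attention_mask)
        # Mean-pool residue embeddings over valid (non-padding) positions.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (out.last_hidden_state * mask).sum(1) / mask.sum(1)
        return self.head(pooled).squeeze(-1)

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = StabilityRegressor()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
loss_fn = nn.MSELoss()

# Toy (sequence, measured stability) pairs standing in for the 528k-sequence dataset.
batch = [("MKTAYIAKQR", -1.2), ("MKTAYIAEQR", 0.3)]
seqs, labels = zip(*batch)
enc = tokenizer(list(seqs), return_tensors="pt", padding=True)

optimizer.zero_grad()
pred = model(enc["input_ids"], enc["attention_mask"])
loss = loss_fn(pred, torch.tensor(labels, dtype=torch.float32))
loss.backward()
optimizer.step()
```

Because the backbone is pretrained on evolutionary sequence data, the regression head can in principle be trained on variants with insertions, deletions, and multiple point mutations, since the model consumes raw sequences rather than fixed-length mutation encodings.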