nach0: Multimodal Natural and Chemical Languages Foundation Model
arXiv (2023)
Abstract
Large Language Models (LLMs) have substantially driven scientific progress in
various domains, and many papers have demonstrated their ability to tackle
complex problems with creative solutions. Our paper introduces a new foundation
model, nach0, capable of solving various chemical and biological tasks:
biomedical question answering, named entity recognition, molecular generation,
molecular synthesis, attribute prediction, and others. nach0 is a multi-domain
and multi-task encoder-decoder LLM pre-trained on unlabeled text from
scientific literature, patents, and molecule strings to incorporate a range of
chemical and linguistic knowledge. We employ instruction tuning, in which
task-specific instructions are used to fine-tune nach0 for the final set of
tasks. To train nach0 effectively, we leverage the NeMo framework,
enabling efficient parallel optimization of both base and large model versions.
Extensive experiments demonstrate that our model outperforms state-of-the-art
baselines on single-domain and cross-domain tasks. Furthermore, it can generate
high-quality outputs in molecular and textual formats, showcasing its
effectiveness in multi-domain setups.
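
To illustrate the instruction-tuning setup the abstract describes, below is a minimal sketch of how a task-specific instruction can be prepended to a chemical input and fed to an encoder-decoder model via the Hugging Face transformers API. The checkpoint name, prompt wording, and reaction string are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: instruction-style prompting of an encoder-decoder model,
# in the spirit of nach0's task-specific instructions. The checkpoint and
# prompt text below are placeholders, not the authors' actual setup.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-base"  # placeholder checkpoint, not nach0 itself
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A task-related instruction prepended to a SMILES input (hypothetical example).
prompt = "Predict the product of the following reaction: CCO.CC(=O)O>>"
inputs = tokenizer(prompt, return_tensors="pt")

# The decoder generates the answer as free text (here, a molecule string).
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In this formulation, every task (question answering, named entity recognition, molecular generation, and so on) is cast as text-to-text generation, which is what lets a single encoder-decoder model cover both the natural-language and chemical-language domains.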