Basic Information

Biography
Along with my advisor, I work on problems at the intersection of natural language processing (NLP) and deep learning. My research interests center broadly on unsupervised, semi-supervised, and transfer learning methods that exploit large volumes of unlabeled data or beneficial relationships among tasks to improve learning on target tasks with limited or no labeled data. Research directions in this vein that I am currently exploring include:
- understanding and improving representations from pre-trained language models
- improving pre-training and fine-tuning methods (learning objectives, model architectures, training data, etc.) to make language models more robust and stable, especially in low-data scenarios
- exploring and improving transferability between NLP tasks
- leveraging data augmentation and self-supervised/semi-supervised learning methods to improve generalization and robustness
- exploring parameter-efficient approaches (e.g., prompt-based learning) for applying large-scale pre-trained language models to downstream tasks (see the sketch after this list)
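As a rough illustration of the last direction, below is a minimal, hypothetical sketch of soft prompt tuning in PyTorch: a short sequence of trainable prompt embeddings is prepended to a frozen language model's input embeddings, so only the prompt parameters are updated for a downstream task. The `SoftPromptWrapper` class, its parameters, and the assumption that the backbone accepts pre-computed input embeddings are illustrative, not taken from any specific paper or library.

```python
import torch
import torch.nn as nn


class SoftPromptWrapper(nn.Module):
    """Hypothetical sketch of prompt tuning: trainable "soft prompt"
    embeddings are prepended to the input embeddings of a frozen
    pre-trained model, so only prompt_len * hidden_dim parameters
    are learned per downstream task."""

    def __init__(self, base_model: nn.Module, embed: nn.Embedding, prompt_len: int = 20):
        super().__init__()
        self.base_model = base_model  # frozen pre-trained LM body (assumed to take embeddings)
        self.embed = embed            # frozen token embedding table
        hidden_dim = embed.embedding_dim
        # Only these parameters receive gradients.
        self.soft_prompt = nn.Parameter(torch.randn(prompt_len, hidden_dim) * 0.02)
        for p in self.base_model.parameters():
            p.requires_grad = False
        for p in self.embed.parameters():
            p.requires_grad = False

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        batch_size = input_ids.size(0)
        token_embeds = self.embed(input_ids)                        # (B, T, H)
        prompt = self.soft_prompt.unsqueeze(0).expand(batch_size, -1, -1)  # (B, P, H)
        inputs_embeds = torch.cat([prompt, token_embeds], dim=1)    # (B, P+T, H)
        return self.base_model(inputs_embeds)
```

In this sketch, only `wrapper.soft_prompt` would be passed to the optimizer (e.g., `torch.optim.Adam([wrapper.soft_prompt])`), which is what makes the approach parameter-efficient relative to full fine-tuning.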
Publications (14)
Sheng Shen, Le Hou, Yanqi Zhou, Nan Du, Shayne Longpre, Jason Wei, Hyung Won Chung, Barret Zoph, William Fedus, Xinyun Chen, Tu Vu, Yuexin Wu,
CoRR (2023) · Citations: 3
arXiv (2022) · Citations: 1