Incorporation of Contextual Information into BERT for Dialog Act Classification in Japanese

2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)(2021)

Abstract
The recently developed Bidirectional Encoder Representations from Transformers (BERT) model outperforms the previous state of the art on many natural language processing tasks in English. Although contextual information is known to be useful for dialog act classification, fine-tuning BERT with contextual information has not been investigated, especially in head-final languages such as Japanese. This paper investigat...
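A common way to supply dialog context when fine-tuning BERT for utterance classification is to concatenate preceding utterances with the target utterance, joined by the model's separator token, so the encoder can attend across turn boundaries. The sketch below illustrates only this input-construction step; the function name, the window size of two preceding turns, and the literal `[SEP]` separator are assumptions for illustration, since the truncated abstract does not specify the paper's actual scheme.

```python
def build_context_input(history, target, max_context=2, sep="[SEP]"):
    """Join up to `max_context` preceding utterances with the target
    utterance using BERT's separator token. Illustrative only; the
    paper's actual input format may differ."""
    context = history[-max_context:]  # keep only the most recent turns
    return f" {sep} ".join(context + [target])

# Hypothetical Japanese dialog turns, oldest first
history = ["こんにちは", "ご用件は何でしょうか"]
print(build_context_input(history, "予約をお願いします"))
# → こんにちは [SEP] ご用件は何でしょうか [SEP] 予約をお願いします
```

The joined string would then be passed to a BERT tokenizer and a classification head; limiting the context window keeps the input under the model's maximum sequence length.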
Keywords
Bit error rate,Transformers,Natural language processing,Reliability,Task analysis,Artificial intelligence,Context modeling