Hierarchical Attention Fusion Network for Question Answering

Yang Chen, Marius Seritan

Semantic Scholar (2019)

Abstract
[1] Seo, M., Kembhavi, A., Farhadi, A., et al. (2016). Bidirectional attention flow for machine comprehension. arXiv preprint arXiv:1611.01603.
[2] Kim, Y., Jernite, Y., Sontag, D., et al. (2016). Character-aware neural language models. In Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16).
[3] Wang, W., Yan, M., Wu, C. (2018). Multi-granularity hierarchical attention fusion networks for reading comprehension and question answering. arXiv preprint arXiv:1811.11934.

● Goal: Build a question answering model on the Stanford Question Answering Dataset (SQuAD).
● Data: Default SQuAD train, dev, and test sets provided by the TAs.
● Evaluation: F1 and EM (exact match); a metric sketch follows this list.
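The abstract only names the evaluation metrics, so the following is a minimal sketch of how SQuAD-style EM and F1 are typically computed, assuming the usual normalization from the official SQuAD evaluation script (lowercasing, stripping punctuation and articles, collapsing whitespace) and scoring each prediction against its best-matching gold answer. The function names here (normalize_answer, exact_match, f1_score, metric_max_over_ground_truths) are illustrative, not taken from the paper.

import re
import string
from collections import Counter

def normalize_answer(s: str) -> str:
    # Lowercase, drop punctuation, remove articles, and collapse whitespace,
    # mirroring the normalization used by the standard SQuAD script.
    s = s.lower()
    s = "".join(ch for ch in s if ch not in set(string.punctuation))
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction: str, ground_truth: str) -> float:
    # EM is 1.0 only if the normalized strings are identical.
    return float(normalize_answer(prediction) == normalize_answer(ground_truth))

def f1_score(prediction: str, ground_truth: str) -> float:
    # Token-level F1 over the normalized prediction and gold answer.
    pred_tokens = normalize_answer(prediction).split()
    gold_tokens = normalize_answer(ground_truth).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

def metric_max_over_ground_truths(metric_fn, prediction, ground_truths):
    # SQuAD questions can have several gold answers; score against the best one.
    return max(metric_fn(prediction, gt) for gt in ground_truths)

# Example: one prediction scored against two reference answers.
print(metric_max_over_ground_truths(exact_match, "the Eiffel Tower", ["Eiffel Tower", "the tower"]))  # 1.0
print(metric_max_over_ground_truths(f1_score, "the Eiffel Tower", ["Eiffel Tower", "the tower"]))     # 1.0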