An Empirical Study of Language CNN for Image Captioning
2017 IEEE International Conference on Computer Vision (ICCV)
Abstract
Language models based on recurrent neural networks have dominated recent image caption generation tasks. In this paper, we introduce a language CNN model which is suited to statistical language modeling and shows competitive performance in image captioning. In contrast to previous models, which predict the next word from a single previous word and a hidden state, our language CNN is fed with all the previous words and can model the long-range dependencies among history words, which are critical for image captioning. The effectiveness of our approach is validated on two datasets, MS COCO and Flickr30K. Extensive experimental results show that our method outperforms vanilla recurrent neural network-based language models and is competitive with state-of-the-art methods.
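The key contrast in the abstract is architectural: instead of conditioning on one previous word plus a hidden state, the language CNN convolves over the embeddings of all history words. The sketch below illustrates that idea with a plain 1-D temporal convolution, max-pooling, and a next-word projection; all dimensions, parameter names, and the pooling choice are hypothetical toy values for illustration, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not from the paper).
vocab_size, embed_dim, num_filters, kernel = 50, 16, 32, 3

# Randomly initialized parameters: embedding table, conv filters, output projection.
E = rng.normal(size=(vocab_size, embed_dim))          # word embeddings
W = rng.normal(size=(num_filters, kernel, embed_dim)) # temporal conv filters
V = rng.normal(size=(num_filters, vocab_size))        # next-word projection

def language_cnn_logits(history):
    """Score the next word from ALL history word ids via a 1-D convolution
    over time, followed by temporal max-pooling (an illustrative sketch,
    not the paper's exact architecture)."""
    X = E[np.asarray(history)]                  # (T, embed_dim)
    T = X.shape[0]
    feats = []
    for t in range(T - kernel + 1):
        window = X[t:t + kernel]                # (kernel, embed_dim)
        # Each filter responds to a k-gram of history embeddings.
        feats.append(np.tanh(np.tensordot(W, window, axes=([1, 2], [0, 1]))))
    H = np.stack(feats)                         # (T-kernel+1, num_filters)
    pooled = H.max(axis=0)                      # pool over all positions
    return pooled @ V                           # (vocab_size,) logits

logits = language_cnn_logits([1, 2, 3, 4, 5, 6, 7, 8])
```

Because the convolution slides over the entire history, every past word can influence the next-word scores, which is the property the abstract contrasts against single-step recurrent prediction.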
Keywords
language models, recurrent neural networks, language CNN model, statistical language modeling tasks, image captioning, history words, vanilla recurrent neural network, image caption generation tasks