Extensions of recurrent neural network language model
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2011
Abstract
We present several modifications of the original recurrent neural network language model (RNN LM). While this model has been shown to significantly outperform many competitive language modeling techniques in terms of accuracy, the remaining problem is its computational complexity. In this work, we show approaches that lead to more than a 15-times speedup for both the training and testing phases. Next, we show the importance of using the backpropagation through time algorithm. An empirical comparison with feedforward networks is also provided. Finally, we discuss possibilities for reducing the number of parameters in the model. The resulting RNN model can thus be smaller, faster during both training and testing, and more accurate than the basic one.
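To make the speedup concrete: the dominant cost in a basic RNN LM is the softmax over the full vocabulary at every time step, and the paper's speedup comes largely from factoring that output layer through word classes, so that only one class distribution and one within-class distribution must be computed per step. Below is a minimal, illustrative sketch of this idea (not the authors' code); the uniform class split, the fixed random weights, and the helper names (`word_class`, `step`, `next_word_prob`) are assumptions made here for brevity.

```python
import numpy as np

# Sketch of an Elman-style RNN language model with a class-factored
# output layer: P(w | h) = P(class(w) | h) * P(w | class(w), h).
# This reduces the per-step softmax cost from O(|V|) to roughly
# O(|V|/C + C) when the vocabulary is split into C classes.

rng = np.random.default_rng(0)

V, H, C = 1000, 50, 20             # vocabulary size, hidden units, classes
words_per_class = V // C           # assume a uniform split for simplicity

def word_class(w):                 # hypothetical class assignment
    return w // words_per_class

W_ih = rng.normal(0, 0.1, (H, V))  # input (one-hot word) -> hidden
W_hh = rng.normal(0, 0.1, (H, H))  # hidden -> hidden (the recurrence)
W_hc = rng.normal(0, 0.1, (C, H))  # hidden -> class scores
W_hw = rng.normal(0, 0.1, (V, H))  # hidden -> within-class word scores

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step(w, h):
    """Consume word id w; return the new hidden state and P(class | h)."""
    h = 1.0 / (1.0 + np.exp(-(W_ih[:, w] + W_hh @ h)))   # sigmoid units
    return h, softmax(W_hc @ h)

def next_word_prob(w_next, h, p_class):
    """P(w_next | h), computing a softmax over one class only."""
    c = word_class(w_next)
    lo, hi = c * words_per_class, (c + 1) * words_per_class
    p_word = softmax(W_hw[lo:hi] @ h)
    return p_class[c] * p_word[w_next - lo]

h = np.zeros(H)
sent = [3, 141, 592, 65]           # toy word ids
logp = 0.0
for w, w_next in zip(sent, sent[1:]):
    h, p_class = step(w, h)
    logp += np.log(next_word_prob(w_next, h, p_class))
print(f"log-probability of toy continuation: {logp:.3f}")
```

In the paper itself, classes are derived from unigram frequencies rather than split uniformly, and the weights are trained rather than fixed. The "backpropagation through time" point of the abstract concerns training: gradients are propagated a fixed number of steps back through the recurrent matrix (`W_hh` above) instead of only one step, which is what lets the hidden state carry longer history than a feedforward n-gram window.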
Keywords
backpropagation, computational complexity, feedforward neural networks, natural language processing, recurrent neural networks, recurrent neural network language model, language modeling, speech recognition