End-To-End Memory Networks
Annual Conference on Neural Information Processing Systems, pp. 2440-2448, 2015.
We introduce a neural network with a recurrent attention model over a possibly large external memory. The architecture is a form of Memory Network, but unlike the model in that work, it is trained end-to-end, and hence requires significantly less supervision during training, making it more generally applicable in realistic settings.