End-To-End Memory Networks

Annual Conference on Neural Information Processing Systems, pp. 2440-2448, 2015.


Abstract:

We introduce a neural network with a recurrent attention model over a possibly large external memory. The architecture is a form of Memory Network [23], but unlike the model in that work, it is trained end-to-end and hence requires significantly less supervision during training, making it more generally applicable in realistic settings.
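The recurrent attention over external memory described above can be sketched as a single "hop": the query's internal state attends over input memory embeddings, and the attention-weighted sum of output memory embeddings is added back to the state. The function and variable names below are illustrative assumptions, not the paper's code; this is a minimal NumPy sketch of one such hop.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_hop(query, mem_in, mem_out):
    """One attention hop over an external memory (illustrative sketch).

    query:   (d,)   internal state u
    mem_in:  (n, d) input memory embeddings m_i (matched against the query)
    mem_out: (n, d) output memory embeddings c_i (read out by attention)
    """
    p = softmax(mem_in @ query)   # attention weights over the n memory slots
    o = p @ mem_out               # attention-weighted read from output memory
    return query + o              # updated internal state for the next hop

# Toy example with random embeddings.
rng = np.random.default_rng(0)
d, n = 4, 6
u = rng.standard_normal(d)
m = rng.standard_normal((n, d))
c = rng.standard_normal((n, d))
u2 = memory_hop(u, m, c)
print(u2.shape)
```

Stacking several such hops, with the output of one hop feeding the query of the next, gives the multi-hop attention the model uses; training end-to-end means the attention weights are learned from the final answer loss alone, without per-hop supervision.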
