Exploring Multidimensional LSTMs for Large Vocabulary ASR
ICASSP, pp. 4940-4944, 2016.
Abstract:
Long short-term memory (LSTM) recurrent neural networks (RNNs) have recently shown significant performance improvements over deep feed-forward neural networks. A key aspect of these models is the use of time recurrence, combined with a gating architecture that allows them to track the long-term dynamics of speech. Inspired by human spectr…
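The "gating architecture" referred to in the abstract is the standard LSTM cell update, sketched below in NumPy for a single time step. This is a generic time-recurrent cell shown for context only, not the multidimensional (time-frequency) variant the paper explores; the function name, stacked weight layout, and toy dimensions are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One time step of a standard LSTM cell: input/forget/output gates plus a
    candidate update, computed from the current input x and the previous state."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b       # stacked pre-activations for the four blocks
    i = sigmoid(z[0:n])              # input gate
    f = sigmoid(z[n:2 * n])          # forget gate
    o = sigmoid(z[2 * n:3 * n])      # output gate
    g = np.tanh(z[3 * n:4 * n])      # candidate cell content
    c = f * c_prev + i * g           # gated cell state carries long-term memory
    h = o * np.tanh(c)               # hidden state passed along the time axis
    return h, c

# Toy usage (assumed shapes): run the cell over 5 frames of 40-dim features.
n_in, n_hid = 40, 8
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4 * n_hid, n_in))
U = 0.1 * rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):
    h, c = lstm_step(x, h, c, W, U, b)
```

The multiplicative forget/input gating is what lets the cell state accumulate or discard information over many frames, which is the long-term tracking behavior the abstract highlights.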