Incremental RNN: A Dynamical View
International Conference on Learning Representations (ICLR), 2020.
Recurrent neural networks (RNNs) are particularly well-suited for modeling long-term dependencies in sequential data, but are notoriously hard to train because the error backpropagated in time either vanishes or explodes at an exponential rate. While a number of works attempt to mitigate this effect through gated recurrent units, skip-connections, …
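The exponential vanishing/explosion the abstract refers to can be seen directly in a linear recurrence, where the gradient of the final state with respect to the initial state is a product of T identical Jacobians. The sketch below is not from the paper; it is a minimal illustration assuming a linear recurrence h_t = W h_{t-1}, with W rescaled to a chosen spectral radius:

```python
# Illustrative sketch (not from the paper): gradient norm through time
# for a linear recurrence h_t = W h_{t-1}. The Jacobian d h_T / d h_0
# is W^T (T-fold product), so its norm shrinks or grows exponentially
# depending on the spectral radius of W.
import numpy as np

rng = np.random.default_rng(0)

def grad_norm_through_time(spectral_radius, T=50, n=8):
    """Frobenius norm of d h_T / d h_0 for h_t = W h_{t-1}."""
    W = rng.standard_normal((n, n))
    # Rescale W so its largest eigenvalue magnitude equals spectral_radius.
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    J = np.eye(n)
    for _ in range(T):
        J = W @ J  # chain rule: one Jacobian factor per time step
    return np.linalg.norm(J)

print(grad_norm_through_time(0.5))  # spectral radius < 1: gradient vanishes
print(grad_norm_through_time(2.0))  # spectral radius > 1: gradient explodes
```

With spectral radius 0.5 the gradient norm decays roughly like 0.5^50, and with radius 2.0 it grows roughly like 2^50, which is the instability that gating and skip connections try to tame.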