Incremental RNN: A Dynamical View

Anil Kag
Ziming Zhang

International Conference on Learning Representations (ICLR), 2020.


Abstract:

Recurrent neural networks (RNNs) are particularly well-suited for modeling long-term dependencies in sequential data, but are notoriously hard to train because the error backpropagated in time either vanishes or explodes at an exponential rate. While a number of works attempt to mitigate this effect through gated recurrent units, skip-connections, ...
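The exponential vanishing/exploding rate mentioned in the abstract follows from the chain rule applied to a vanilla recurrence. The sketch below is standard background rather than material from the paper; the recurrence h_t = phi(W h_{t-1} + U x_t) and the symbols phi, W, U, a_k, T are generic notation assumed here, not notation taken from the authors.

% Backpropagation through time writes the error signal at step t as a
% product of step-to-step Jacobians, with h_t = \phi(a_t) and
% a_t = W h_{t-1} + U x_t:
\[
\frac{\partial \mathcal{L}}{\partial h_t}
  = \frac{\partial \mathcal{L}}{\partial h_T}
    \prod_{k=t}^{T-1} \frac{\partial h_{k+1}}{\partial h_k},
\qquad
\frac{\partial h_{k+1}}{\partial h_k}
  = \operatorname{diag}\!\bigl(\phi'(a_{k+1})\bigr)\, W .
\]
% Bounding each factor by its operator norm gives an exponential rate in
% the horizon T - t:
\[
\Bigl\lVert \frac{\partial h_T}{\partial h_t} \Bigr\rVert
  \;\le\; \Bigl( \lVert W \rVert \, \sup_a \lvert \phi'(a) \rvert \Bigr)^{T-t},
\]
% so the gradient provably vanishes when \lVert W \rVert \sup_a |\phi'(a)| < 1,
% and, since the bound is not tight, the product of Jacobians can likewise
% grow exponentially when that quantity exceeds 1 -- the two failure modes
% the abstract refers to.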
