Adaptively Truncating Backpropagation Through Time to Control Gradient Bias
UAI 2019, pp. 290.
Truncated backpropagation through time (TBPTT) is a popular method for learning in recurrent neural networks (RNNs) that saves computation and memory at the cost of bias by truncating backpropagation after a fixed number of lags. In practice, choosing the optimal truncation length is difficult: TBPTT will not converge if the truncation …