A Formal Hierarchy of RNN Architectures

ACL, pp. 443-459, 2020.

Highlight:
While this means existing rational recurrent neural networks are fundamentally limited compared to long short-term memory networks, we find that it is not necessarily being rationally recurrent that limits them: we prove that a WFA can perfectly encode its input—something no satur...

Abstract:

We develop a formal hierarchy of the expressive capacity of RNN architectures. The hierarchy is based on two formal properties: space complexity, which measures the RNN's memory, and rational recurrence, defined as whether the recurrent update can be described by a weighted finite-state machine. We place several RNN variants within this hierarchy.
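The notion of rational recurrence can be made concrete with a small example. The sketch below is not the paper's code; the variable names, shapes, and the particular gated update c_t = f_t * c_{t-1} + z_t (with gate f_t and candidate z_t depending only on the current input) are illustrative assumptions. Unrolling this recurrence shows that each coordinate of the state is a weighted sum over input positions, which is the kind of series a weighted finite-state machine computes; the code checks that the recurrence and the unrolled sum agree.

# Minimal sketch (illustrative only, not from the paper): a gated recurrence whose
# gates depend only on the current input equals a WFA-style weighted sum over positions.
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4                        # sequence length and hidden size (arbitrary choices)
x = rng.normal(size=(T, d))        # toy input sequence

def gates(x_t):
    """Forget gate and candidate computed from the current input only."""
    f_t = 1.0 / (1.0 + np.exp(-x_t))   # sigmoid forget gate in (0, 1)
    z_t = np.tanh(x_t)                 # candidate update
    return f_t, z_t

# (1) State via the usual recurrence: c_t = f_t * c_{t-1} + z_t.
c = np.zeros(d)
for t in range(T):
    f_t, z_t = gates(x[t])
    c = f_t * c + z_t

# (2) The same state as a weighted sum over positions:
#     c_T[i] = sum_j ( prod_{k > j} f_k[i] ) * z_j[i],
#     i.e. a rational series over the input, as a WFA would compute it.
fs, zs = zip(*(gates(x[t]) for t in range(T)))
c_wfa = np.zeros(d)
for j in range(T):
    weight = np.prod([fs[k] for k in range(j + 1, T)], axis=0) if j + 1 < T else 1.0
    c_wfa += weight * zs[j]

assert np.allclose(c, c_wfa)       # recurrence and WFA-style sum agree

By contrast, when the gates also depend on the previous hidden state (as in an LSTM), the state no longer unrolls into such a fixed weighted sum, which is the intuition behind separating rational from non-rational recurrences.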