RECURRENT NEURAL NETWORKS AS WEIGHTED LANGUAGE RECOGNIZERS
North American Chapter of the Association for Computational Linguistics (NAACL), pp. 2261-2271, 2018.
We investigate the computational complexity of various problems for simple recurrent neural networks (RNNs) as formal models for recognizing weighted languages. We focus on single-layer, ReLU-activation, rational-weight RNNs with softmax, which are commonly used in natural language processing applications. We show that most problems for such networks are undecidable, including consistency, equivalence, minimization, and the determination of the highest-weighted string.
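To make the object of study concrete, the following is a minimal sketch of the model class the abstract names: a single-layer ReLU RNN whose softmax output assigns each string a weight, namely the product of the per-step probabilities of its symbols. All parameter names here (U, V, b, E, h0) are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def rnn_string_weight(string, U, V, b, E, h0):
    """Weight of a symbol sequence under a single-layer ReLU RNN.

    At each step, softmax(E @ h) gives a distribution over the alphabet;
    the string's weight is the product of the probabilities of its symbols.
    Hidden state update: h_t = relu(U @ h_{t-1} + V @ x_t + b), with x_t
    the one-hot encoding of the symbol just read.
    """
    h = h0
    weight = 1.0
    for sym in string:
        logits = E @ h
        probs = np.exp(logits - logits.max())  # numerically stable softmax
        probs /= probs.sum()
        weight *= probs[sym]                   # probability of this symbol
        x = np.zeros(V.shape[1])
        x[sym] = 1.0                           # one-hot input encoding
        h = np.maximum(0.0, U @ h + V @ x + b) # ReLU recurrence
    return weight
```

With rational (here, floating-point) weights, the per-length weights are normalized by construction: the weights of all length-1 strings sum to 1, since they come from a single softmax. The paper's undecidability results concern questions about this weight function, e.g. whether it sums to 1 over all strings (consistency) or which string maximizes it.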