RECURRENT NEURAL NETWORKS AS WEIGHTED LANGUAGE RECOGNIZERS

North American Chapter of the Association for Computational Linguistics (NAACL), pp. 2261-2271, 2018.


Abstract:

We investigate the computational complexity of various problems for simple recurrent neural networks (RNNs) as formal models for recognizing weighted languages. We focus on the single-layer, ReLU-activation, rational-weight RNNs with softmax, which are commonly used in natural language processing applications. We show that most problems...
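To make the abstract's setting concrete, here is a minimal sketch of the model class it describes: a single-layer ReLU RNN whose softmax output layer assigns each string a weight, namely the product of the per-step probabilities of its symbols (with an end-of-string symbol appended). All parameter names, dimensions, and the toy random weights below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def string_weight(symbols, W, U, b, V, c, eos):
    """Weight of a symbol sequence under a single-layer ReLU RNN:
    the product of the softmax probability of each symbol, with an
    end-of-string symbol `eos` appended.  (Hypothetical parameter
    names; sketch only, not the paper's construction.)"""
    h = np.zeros(W.shape[0])          # hidden state starts at zero
    weight = 1.0
    for sym in list(symbols) + [eos]:
        p = softmax(V @ h + c)        # distribution over next symbol
        weight *= p[sym]
        one_hot = np.zeros(U.shape[1])
        one_hot[sym] = 1.0
        h = np.maximum(0.0, W @ h + U @ one_hot + b)  # ReLU update
    return weight

# Toy demo with small random weights (assumed sizes).
rng = np.random.default_rng(0)
d, vocab = 4, 3                       # hidden size; vocab = {0, 1, eos=2}
W, U = rng.normal(size=(d, d)), rng.normal(size=(d, vocab))
b = rng.normal(size=d)
V, c = rng.normal(size=(vocab, d)), rng.normal(size=vocab)

w = string_weight([0, 1, 0], W, U, b, V, c, eos=2)
```

Because each step multiplies in a softmax probability, every string's weight lies strictly between 0 and 1, and the weights of all finite strings form a (sub)distribution — the sense in which such an RNN "recognizes" a weighted language.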
