Encoding of sequential translators in discrete-time recurrent neural nets

ESANN (1999)

Abstract
In recent years, there has been considerable interest in using discrete-time recurrent neural nets (DTRNN) to learn finite-state tasks, and in the computational power of DTRNN, particularly in connection with finite-state computation. This paper describes a simple strategy to devise stable encodings of sequential finite-state translators (SFST) in a second-order DTRNN whose units have bounded, strictly growing, continuous sigmoid activation functions. The strategy relies on bounding criteria derived from a study of the conditions under which the DTRNN actually behaves as an SFST.
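The abstract's idea can be illustrated with a minimal sketch, not the paper's exact construction: a second-order DTRNN whose weights encode the transition table of a toy two-state translator, with a large gain `H` driving the sigmoid units toward saturated (near 0/1) values so the network remains stably in finite-state mode. The machine, the weight layout, and the value of `H` below are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy SFST: states {0, 1}, inputs {'a', 'b'}.
# 'a' toggles the state, 'b' leaves it unchanged; each transition
# outputs the label of the destination state.
delta = {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1}

n_states, symbols = 2, ['a', 'b']
H = 10.0  # gain; large H saturates the sigmoid toward {0, 1} (illustrative choice)

# Second-order weights W[i, j, k]: unit i at time t receives W[i,j,k] * x[j] * u[k],
# i.e. the product of the previous state vector and the current input vector.
W = np.zeros((n_states, n_states, len(symbols)))
for (j, c), i in delta.items():
    W[i, j, symbols.index(c)] = 1.0

def run(input_string, start=0):
    x = np.eye(n_states)[start]  # one-hot encoding of the start state
    output = []
    for ch in input_string:
        u = np.eye(len(symbols))[symbols.index(ch)]
        # Second-order update with a -0.5 bias so units saturate near 0 or 1
        x = sigmoid(H * (np.einsum('ijk,j,k->i', W, x, u) - 0.5))
        output.append(str(int(np.argmax(x))))
    return ''.join(output)
```

For example, `run("aab")` follows the transitions 0→1→0→0 and emits "100". Because the sigmoid outputs stay close to 0 or 1 at every step (rather than drifting toward 0.5), the state vector remains a reliable encoding of the discrete state over arbitrarily long inputs, which is the kind of stability condition the paper's bounding criteria formalize.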