Sequence Learning with Incremental Higher-Order Neural Networks (1993)

Abstract
An incremental, higher-order, non-recurrent neural network combines two properties found to be useful for sequence learning in neural networks: higher-order connections and the incremental introduction of new units. The incremental, higher-order neural network adds higher orders when needed by adding new units that dynamically modify connection weights. The new units modify the weights at the next time step with information from the previous step. Since a theoretically unlimited number of units can be added to the network, information from the arbitrarily distant past can be brought to bear on each prediction. Temporal tasks can thereby be learned without the use of feedback, in contrast to recurrent neural networks. Because there are no recurrent connections, training is simple and fast. Experiments have demonstrated speedups of two orders of magnitude over recurrent networks.
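The core mechanism the abstract describes (previous-step information modulating the connection weights used at the current step, rather than flowing through a recurrent connection) can be sketched as a toy second-order prediction step. This is a minimal illustration under assumed shapes and an assumed additive form, not the paper's exact formulation; the function name `higher_order_step` and the weight tensors `W` and `V` are hypothetical.

```python
import numpy as np

def higher_order_step(x_prev, x_cur, W, V):
    """One prediction step of a hypothetical second-order unit.

    W: (n_out, n_in)       first-order connection weights
    V: (n_out, n_in, n_in) second-order weights; the previous input
       x_prev modulates the effective weights applied to x_cur, so
       past information reaches the output with no recurrent loop.
    """
    # Contract V with the previous input to get a weight adjustment.
    W_eff = W + np.tensordot(V, x_prev, axes=([2], [0]))  # (n_out, n_in)
    return W_eff @ x_cur

# Toy usage: 2 inputs, 1 output.
W = np.array([[1.0, 0.0]])                  # first-order weights
V = np.zeros((1, 2, 2))
V[0, 0, 1] = 1.0                            # one second-order connection
y = higher_order_step(np.array([0.0, 1.0]), np.array([1.0, 0.0]), W, V)
# Effective weights become [[2.0, 0.0]], so y == [2.0]
```

Because each step is a feedforward computation given the stored previous input, training reduces to ordinary gradient descent on `W` and `V` with no backpropagation through time, which is the source of the speedup the abstract claims over recurrent networks.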
Keywords
recurrent network, higher-order neural network, higher order, connection weight, higher-order connection, incremental higher-order neural networks, new unit, sequence learning, non-recurrent neural network, recurrent connection, incremental introduction, distant past