Capacity of Neural Networks for Lifelong Learning of Composable Tasks

2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS)

Abstract
We investigate neural circuits in the exacting setting that (i) the acquisition of a piece of knowledge can occur from a single interaction, (ii) the result of each such interaction is a rapidly evaluatable subcircuit, (iii) hundreds of thousands of such subcircuits can be acquired in sequence without substantially degrading the earlier ones, and (iv) recall can be in the form of a rapid evaluation of a composition of subcircuits that have been so acquired at arbitrary different earlier times.

We develop a complexity theory, in terms of asymptotically matching upper and lower bounds, on the capacity of a neural network for executing, in this setting, the following action, which we call association: each association sets up a subcircuit so that the excitation of a chosen set of neurons A will in future cause the excitation of another chosen set B. As the model of computation we consider the neuroidal model, a fully distributed model in which the quantitative resources are all accounted for: n, the number of neurons; d, the number of other neurons each neuron is connected to; and k, the inverse of the maximum synaptic strength.

A succession of experiences, possibly over a lifetime, results in the realization of a complex set of subcircuits. The composability requirement constrains the model to ensure that, for each association realized by a subcircuit, the excitation in the triggering set of neurons A is quantitatively similar to that in the triggered set B, and also that the unintended excitation in the rest of the system is negligible. These requirements ensure that chains of associations can be triggered.

We first analyze what we call the Basic Mechanism, which uses only direct connections between neurons in the triggering set A and the target set B. We consider random networks of n neurons with expected number d of connections to and from each. We show that in the composable context capacity growth is limited by d^2, a severe limitation if the network is sparse, as it is in cortex. We go on to study the Expansive Mechanism, which additionally uses intermediate relay neurons that have high synaptic weights. For this mechanism we show that the capacity can grow as dn, to within logarithmic factors. From these two results it follows that in the composable regime, for the realistic cortical estimate of d = n^{1/2}, superlinear capacity of order n^{3/2} in terms of the number of neurons can be realized by the Expansive Mechanism, instead of the linear order n to which the Basic Mechanism is limited. More generally, for both mechanisms, we establish matching upper and lower bounds on capacity in terms of the parameters n, d, and the inverse maximum synaptic strength k.

The results as stated above assume that in a set of associations, a target B can be triggered by at most one set A. It can be shown that the capacities are similar if the number m of sets A that can trigger a B is greater than one but small, but become severely constrained if m exceeds a certain threshold.
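To make the stated scalings concrete, here is a minimal, illustrative sketch (not from the paper) comparing the two capacity bounds. Constants and logarithmic factors are omitted, the function names and the printed table are our own, and the only inputs taken from the abstract are the d^2 and dn growth rates and the cortical estimate d = n^{1/2}.

```python
# Illustrative comparison of the capacity scalings stated in the abstract.
# These are asymptotic growth rates with constants and log factors dropped,
# not exact capacity counts; function names are hypothetical, for this sketch.

def basic_capacity(d: int) -> int:
    """Basic Mechanism: capacity growth is limited by d^2."""
    return d * d

def expansive_capacity(n: int, d: int) -> int:
    """Expansive Mechanism: capacity can grow as d*n (within log factors)."""
    return d * n

if __name__ == "__main__":
    print(f"{'n':>10} {'d=n^(1/2)':>10} {'basic~d^2':>12} {'expansive~dn':>15}")
    for n in (10**4, 10**6, 10**8):
        d = int(n ** 0.5)  # the cortical estimate d = n^{1/2} from the abstract
        print(f"{n:>10} {d:>10} {basic_capacity(d):>12} {expansive_capacity(n, d):>15}")
```

With d = n^{1/2} the table reproduces the abstract's comparison: the Basic Mechanism's d^2 bound is linear in n, while the Expansive Mechanism's dn bound is of order n^{3/2}.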
Keywords
neuroscience,neural computation,neuroidal model,computational complexity,associations