Conditional Information Gain Trellis
CoRR (2024)
Abstract
Conditional computing processes an input using only part of the neural
network's computational units. Learning to execute parts of a deep
convolutional network by routing individual samples has several advantages.
The most obvious is a reduced computational burden. Furthermore, if similar
classes are routed to the same path, that part of the network learns to
discriminate between finer differences, and better classification accuracy
can be attained with fewer parameters. Recently, several papers have
exploited this idea to select a particular child of a node in a tree-shaped
network or to skip parts of a network. In this work, we follow a
Trellis-based approach for generating specific execution paths in a deep
convolutional neural network. We design routing mechanisms that use
differentiable information gain-based cost functions to determine which
subset of features in a convolutional layer will be executed. We call our
method Conditional Information Gain Trellis (CIGT). We show that our
conditional execution mechanism achieves comparable or better model
performance than unconditional baselines, using only a fraction of the
computational resources.
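To make the routing objective concrete, the following is a minimal sketch of a soft (differentiable-style) information gain computed from per-sample route probabilities, in the spirit of the cost function described above. This is an illustrative reconstruction, not the paper's implementation: the function name `information_gain`, the soft-count formulation, and the pure-Python setting are all assumptions made for clarity.

```python
import math

def entropy(dist):
    # Shannon entropy of a normalized probability distribution (natural log).
    return -sum(p * math.log(p) for p in dist if p > 0)

def information_gain(route_probs, labels, num_classes):
    """Soft information gain of a routing decision (illustrative sketch).

    route_probs: per-sample route probability vectors (e.g., softmax outputs
                 of a router), one list per sample.
    labels:      integer class label for each sample.
    Returns H(Y) - sum_r p(r) * H(Y | route=r), estimated from soft counts.
    """
    n = len(labels)
    num_routes = len(route_probs[0])
    # Soft joint counts: counts[r][y] = sum_i p(route=r | x_i) * 1[y_i == y].
    counts = [[0.0] * num_classes for _ in range(num_routes)]
    for probs, y in zip(route_probs, labels):
        for r, p in enumerate(probs):
            counts[r][y] += p
    # Marginal class distribution p(y).
    class_dist = [sum(counts[r][y] for r in range(num_routes)) / n
                  for y in range(num_classes)]
    ig = entropy(class_dist)
    # Subtract the route-weighted conditional entropies.
    for r in range(num_routes):
        mass = sum(counts[r])
        if mass > 0:
            ig -= (mass / n) * entropy([c / mass for c in counts[r]])
    return ig
```

When the router cleanly separates the two classes into two routes, the gain equals the full class entropy `H(Y) = ln 2`; when routing is uninformative (uniform probabilities for every sample), the gain is zero. Maximizing this quantity encourages the router to send similar classes down the same path, which is the behavior the method relies on.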