FlowerFormer: Empowering Neural Architecture Encoding using a Flow-aware Graph Transformer
CVPR 2024
Abstract
The success of a specific neural network architecture is closely tied to the
dataset and task it tackles; there is no one-size-fits-all solution. Thus,
considerable efforts have been made to quickly and accurately estimate the
performances of neural architectures, without full training or evaluation, for
given tasks and datasets. Neural architecture encoding has played a crucial
role in this estimation, and graph-based methods, which treat an architecture as
a graph, have shown prominent performance. For enhanced representation learning
of neural architectures, we introduce FlowerFormer, a powerful graph
transformer that incorporates the information flows within a neural
architecture. FlowerFormer consists of two key components: (a) bidirectional
asynchronous message passing, inspired by the flows; (b) global attention built
on flow-based masking. Our extensive experiments demonstrate the superiority of
FlowerFormer over existing neural encoding methods, and its effectiveness
extends beyond computer vision models to include graph neural networks and auto
speech recognition models. Our code is available at
http://github.com/y0ngjaenius/CVPR2024_FLOWERFormer.
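The abstract's second component, global attention built on flow-based masking, can be illustrated with a small sketch. This is not the authors' implementation; it assumes one plausible reading of "flow-based masking", namely that a node in the architecture DAG may attend only to nodes connected to it by a directed path (in either direction), so attention follows the information flow:

```python
# Illustrative sketch (assumption, not the paper's code): build a flow-based
# attention mask for a neural-architecture DAG given as adjacency lists.

def reachability(adj):
    """Transitive closure of a DAG {node: [successors]} -> {node: set of descendants}."""
    nodes = list(adj)
    reach = {u: set(adj[u]) for u in nodes}
    changed = True
    while changed:  # iterate until no new descendants are discovered
        changed = False
        for u in nodes:
            extra = set()
            for v in reach[u]:
                extra |= reach.get(v, set())
            if not extra <= reach[u]:
                reach[u] |= extra
                changed = True
    return reach

def flow_mask(adj):
    """mask[u] = nodes u may attend to: itself plus any node on a flow path to/from u."""
    nodes = list(adj)
    reach = reachability(adj)
    return {
        u: {v for v in nodes if v == u or v in reach[u] or u in reach[v]}
        for u in nodes
    }

# Toy cell: in -> conv -> out, plus a skip edge in -> out;
# "pool" is an unconnected op included for contrast.
arch = {"in": ["conv", "out"], "conv": ["out"], "out": [], "pool": []}
mask = flow_mask(arch)
```

Under this reading, `in`, `conv`, and `out` all attend to one another because they share flow paths, while the disconnected `pool` attends only to itself; the mask would then be applied to the attention-score matrix before the softmax.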