Multi-Scale Implicit Transformer with Re-parameterize for Arbitrary-Scale Super-Resolution
CoRR (2024)
Abstract
Recently, methods based on implicit neural representations have shown
excellent capabilities for arbitrary-scale super-resolution (ASSR). Although
these methods represent the features of an image by generating latent codes,
those latent codes are difficult to adapt to different super-resolution
magnification factors, which seriously degrades their performance. To address
this, we design the Multi-Scale Implicit Transformer (MSIT), consisting of a
Multi-scale Neural Operator (MSNO) and Multi-Scale Self-Attention (MSSA).
MSNO obtains multi-scale latent codes through feature enhancement, multi-scale
characteristics extraction, and multi-scale characteristics merging. MSSA
further enhances the multi-scale characteristics of the latent codes, resulting
in better performance. Furthermore, to improve the performance of the network,
we propose the Re-Interaction Module (RIM), combined with a cumulative training
strategy, to enrich the diversity of information learned by the network. We
systematically introduce multi-scale characteristics into ASSR for the first
time. Extensive experiments validate the effectiveness of MSIT, and our method
achieves state-of-the-art performance on arbitrary-scale super-resolution
tasks.
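The abstract describes MSNO as extracting characteristics at several scales and then merging them into multi-scale latent codes. The following is a minimal NumPy sketch of that general idea only: each "scale" is approximated here by average pooling with a different window size, and merging is plain channel-wise concatenation. The function name, the choice of pooling, and the merge-by-concatenation step are illustrative assumptions, not the paper's actual MSNO definition.

```python
import numpy as np

def multi_scale_latent_codes(feat, scales=(1, 2, 4)):
    """Toy sketch of multi-scale extraction and merging.

    feat: (H, W, C) feature map.
    For each scale s, smooth the map with an s x s mean window
    (stride 1, edge padding), then merge all branches by
    concatenating along the channel axis.
    This is an assumed stand-in for MSNO, not the paper's operator.
    """
    h, w, _ = feat.shape
    branches = []
    for s in scales:
        if s == 1:
            branches.append(feat)  # identity branch: finest scale
            continue
        pad = s // 2
        padded = np.pad(feat, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
        pooled = np.empty_like(feat)
        for i in range(h):
            for j in range(w):
                # mean over the s x s spatial window at this position
                pooled[i, j] = padded[i:i + s, j:j + s].mean(axis=(0, 1))
        branches.append(pooled)
    # merged multi-scale latent codes: (H, W, C * len(scales))
    return np.concatenate(branches, axis=-1)
```

On a constant feature map every branch reproduces the input, so the output is simply the input repeated along the channel axis; on real features the larger windows carry progressively coarser context.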