Fast On-Device Learning Framework for Single-Image Super-Resolution

Seok Hee Lee, Karam Park, Sunwoo Cho, Hyun-Seung Lee, Kyuha Choi, Nam Ik Cho

IEEE Access (2024)

Abstract
When implementing a super-resolution (SR) model on an edge device, it is common to train the model in the cloud using predetermined training images, because large-scale training data and computation power are unavailable on the edge device. However, such frameworks may encounter a domain gap issue because input images to these devices often have different characteristics than those used in training. Therefore, it is essential to continually update the model parameters through on-device learning, which takes into account the limited computation power of edge devices and makes use of on-site input images. In this paper, we present a fast and efficient on-device learning framework for an SR model that aims to overcome the challenges posed by restricted computation and domain gap issues. Specifically, we propose an architecture for training the SR model in a quantized domain, which helps to reduce the quantization errors that accumulate during training. Additionally, we propose cost-constrained gradient pruning and meta-learning-based fast training schemes to enhance restoration performance within a smaller number of iterations. Experimental results show that our approach maintains restoration performance on unseen inputs with a lightweight model obtained by our quantization scheme.
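The two core ideas named in the abstract, training in a quantized domain and magnitude-based gradient pruning, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the uniform symmetric quantizer, and the `keep_ratio` parameter are illustrative assumptions.

```python
import numpy as np

def fake_quantize(x, num_bits=8):
    # Illustrative uniform symmetric quantizer (an assumption, not the
    # paper's scheme): round values to num_bits integer levels, then map
    # back to floats. Training on such quantize-dequantize values keeps
    # the model in the quantized domain, so quantization error is seen
    # during training instead of accumulating afterwards.
    qmax = 2 ** (num_bits - 1) - 1
    max_abs = np.max(np.abs(x))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)
    return q * scale

def prune_gradients(g, keep_ratio=0.25):
    # Illustrative magnitude-based gradient pruning: keep only the
    # largest-magnitude fraction of gradient entries and zero the rest,
    # reducing the per-iteration update cost on the device.
    k = max(1, int(len(g) * keep_ratio))
    thresh = np.sort(np.abs(g))[-k]
    return np.where(np.abs(g) >= thresh, g, 0.0)

w = np.array([0.12, -0.53, 0.98, -1.27])
w_q = fake_quantize(w, num_bits=8)          # weights as seen in training
g = np.array([0.4, -0.1, 0.05, -0.7])
g_p = prune_gradients(g, keep_ratio=0.5)    # only 2 of 4 entries survive
```

The quantizer here maps the weight range onto 8-bit levels, so the round-trip error per entry is at most half a quantization step; the pruning step is a plain top-k filter, whereas the paper's cost-constrained variant additionally accounts for a compute budget.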
Keywords
Training data, Quantization (signal), Superresolution, Metalearning, Image edge detection, Computational modeling, Task analysis, Image restoration, Computational efficiency, Cloud computing, Gradient pruning, meta-learning, neural network acceleration, neural network compression, neural network quantization, on-device learning, pruning, super-resolution