
Learned Image Compression with Inception Residual Blocks and Multi-Scale Attention Module

Picture Coding Symposium (2022)

Abstract
Recently, deep learning-based image compression methods have achieved superior performance compared to traditional methods. However, the complexity of the leading schemes is still quite high, in both the core network and the entropy coding. In this paper, we propose two efficient modules. First, we adopt an inception residual block (IRB) in the core network, which has lower complexity than the previous non-local attention module and concatenated residual blocks. Second, we employ a multi-scale attention module (MSAM), which aggregates features from three different scales to capture global information. The output of the MSAM is used as an importance map to guide bit allocation. In addition, a simple Gaussian mixture model is used in the entropy coding instead of more complicated models. Experimental results demonstrate that the encoding and decoding of our method are about 17 times faster than those of the state-of-the-art method. Although the R-D performance drops slightly, it is still better than that of H.266/VVC (4:4:4) and other recent learning-based methods.
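The abstract names the two proposed modules but gives no implementation details. Below is a minimal PyTorch sketch of how an inception-style residual block and a three-scale attention module producing a sigmoid importance map could look; all layer choices (branch widths, kernel sizes, pooling factors, activations) are assumptions for illustration, not the paper's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class InceptionResidualBlock(nn.Module):
    """Inception-style residual block (sketch): parallel branches with
    different receptive fields, concatenated and fused, plus a skip."""

    def __init__(self, channels: int):
        super().__init__()
        b = channels // 4  # per-branch width (assumed split)
        self.branch1 = nn.Conv2d(channels, b, kernel_size=1)
        self.branch3 = nn.Sequential(
            nn.Conv2d(channels, b, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(b, b, kernel_size=3, padding=1),
        )
        self.branch5 = nn.Sequential(
            nn.Conv2d(channels, b, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(b, b, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(b, b, kernel_size=3, padding=1),  # two 3x3 ~ one 5x5 field
        )
        # 1x1 fusion back to the input width so the residual add type-checks.
        self.fuse = nn.Conv2d(3 * b, channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = torch.cat([self.branch1(x), self.branch3(x), self.branch5(x)], dim=1)
        return self.act(x + self.fuse(y))  # residual connection


class MultiScaleAttentionModule(nn.Module):
    """Sketch of an MSAM: features at three scales are aggregated into a
    sigmoid importance map in (0, 1) that reweights the input features."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)  # full scale
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)  # 1/2 scale
        self.conv4 = nn.Conv2d(channels, channels, 3, padding=1)  # 1/4 scale
        self.fuse = nn.Conv2d(3 * channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, w = x.shape[-2:]
        f1 = self.conv1(x)
        f2 = F.interpolate(self.conv2(F.avg_pool2d(x, 2)), size=(h, w), mode="nearest")
        f4 = F.interpolate(self.conv4(F.avg_pool2d(x, 4)), size=(h, w), mode="nearest")
        m = torch.sigmoid(self.fuse(torch.cat([f1, f2, f4], dim=1)))  # importance map
        return x * m  # reweighted features guide bit allocation downstream


# Usage: both modules preserve the input shape.
x = torch.randn(1, 128, 64, 64)
y = MultiScaleAttentionModule(128)(InceptionResidualBlock(128)(x))
assert y.shape == x.shape
```

The intent here is only to make the two module names concrete: the IRB trades a single wide residual path for cheaper parallel branches, and the MSAM turns pooled multi-scale context into a per-pixel gate, which is one common way an "importance map" is realized.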
Key words
Learned image compression, inception residual block, multi-scale attention module