Arbitrary style transfer method with attentional feature distribution matching

Bin Ge, Zhenshan Hu, Chenxing Xia, Junming Guan

Multimedia Systems (2024)

Abstract
Most arbitrary style transfer methods consider only transferring features between the style and content images. Although pixel-wise style transfer is achieved, such models are limited in preserving the content structure: they tend to transfer the style features while preserving the content only weakly, and image information is lost during the transfer process, so the generated images exhibit artifacts and patterns carried over from the style images. In this paper, an attentional feature distribution matching method for arbitrary style transfer is proposed. In the network architecture, a combination of the self-attention mechanism and second-order statistics performs the style transfer, and a style-strengthen block enhances the style features of the generated images. In the loss function, the traditional content loss is not used; instead, we integrate the attention mechanism and feature distribution matching to construct the loss, strengthening the constraints so as to avoid artifacts in the generated images. Qualitative and quantitative experiments demonstrate the effectiveness of our method in improving style transfer quality compared with state-of-the-art arbitrary style transfer methods.
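The two architectural building blocks named in the abstract, second-order statistics matching and self-attention over feature maps, have standard formulations in the style transfer literature. The following NumPy sketch illustrates those generic formulations only; the function names, shapes, and normalization choices are assumptions, not the paper's exact method.

```python
import numpy as np

def stats_match(content, style, eps=1e-5):
    """Second-order statistics matching (AdaIN-style): shift/scale each
    channel of the content features to the style features' mean and std.
    Features are (C, H, W) arrays. Illustrative, not the paper's exact form."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return (content - c_mean) / (c_std + eps) * s_std + s_mean

def attention_transfer(content, style, eps=1e-5):
    """Self-attention-style transfer: each content position attends over all
    style positions of mean/variance-normalized maps and aggregates the raw
    style features. Illustrative sketch only."""
    C, H, W = content.shape

    def norm(x):
        m = x.mean(axis=(1, 2), keepdims=True)
        s = x.std(axis=(1, 2), keepdims=True)
        return (x - m) / (s + eps)

    q = norm(content).reshape(C, -1).T   # (HW, C) queries from content
    k = norm(style).reshape(C, -1)       # (C, HW) keys from style
    v = style.reshape(C, -1).T           # (HW, C) values: raw style features
    logits = q @ k                       # (HW, HW) pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)  # stabilize softmax
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)
    return (attn @ v).T.reshape(C, H, W)
```

In this generic setup, `stats_match` guarantees the output's channel-wise statistics equal the style's (up to `eps`), while `attention_transfer` recombines style features according to local pattern similarity; combining the two is one common way to trade off global statistics against spatially adaptive transfer.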
Keywords
Style transfer, Attention mechanism, Image generation, Feature matching