Arbitrary style transfer method with attentional feature distribution matching

Bin Ge, Zhenshan Hu, Chenxing Xia, Junming Guan

Multimedia Systems (2024)

Most arbitrary style transfer methods consider only transferring features between the style and content images. Although pixel-wise style transfer is achieved, preservation of the content structure is weak: the model tends to over-transfer style features, image information is lost during the transfer process, and the generated images exhibit artifacts and patterns from the style images. In this paper, an attentional feature distribution matching method for arbitrary style transfer is proposed. In the network architecture, a combination of the self-attention mechanism and second-order statistics is used to perform style transfer, and a style-strengthening block enhances the style features of the generated images. In the loss function, the traditional content loss is not used; instead, we integrate the attention mechanism and feature distribution matching to construct the loss, which strengthens the constraints and avoids artifacts in the generated images. Qualitative and quantitative experiments demonstrate that our method improves style transfer quality compared with state-of-the-art arbitrary style transfer methods.
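The abstract mentions style transfer via second-order statistics of features. A common realization of this idea (in the spirit of AdaIN-style normalization, not necessarily the paper's exact formulation) aligns the channel-wise mean and standard deviation of content features to those of style features. The sketch below is illustrative only; the function name, tensor shapes, and epsilon value are assumptions:

```python
import numpy as np

def match_second_order_stats(content, style, eps=1e-5):
    """Align channel-wise mean and standard deviation of the content
    feature map to those of the style feature map.

    content, style: feature maps of shape (C, H, W).
    Returns a feature map with content structure but style statistics.
    """
    # Per-channel first- and second-order statistics.
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    # Whiten the content features, then re-color with style statistics.
    normalized = (content - c_mean) / c_std
    return normalized * s_std + s_mean
```

In a full network, such matching would typically operate on intermediate encoder features (e.g. VGG activations) rather than raw pixels, with the attention mechanism weighting which spatial positions contribute to the statistics.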
Key words
Style transfer, Attention mechanism, Image generation, Feature matching