Towards compact reversible image representations for neural style transfer

Xiyao Liu, Siyu Yang, Jian Zhang*, Gerald Schaefer, Jiya Li, Xunli Fan, Songtao Wu, Hui Fang*

Abstract


"Arbitrary neural style transfer aims to stylise a content image by referencing a provided style image. Despite various efforts to achieve both content preservation and style transferability, learning effective representations for this task remains challenging since the redundancy of content and style features leads to unpleasant image artefacts. In this paper, we learn compact neural representations for style transfer motivated from an information theoretical perspective. In particular, we enforce compressive representations across sequential modules of a reversible flow network in order to reduce feature redundancy without losing content preservation capability. We use a Barlow twins loss to reduce channel dependency and thus to provide better content expressiveness, and optimise the Jensen-Shannon divergence of style representations between reference and target images to avoid under- and over-stylisation. We comprehensively demonstrate the effectiveness of our proposed method in comparison to other state-of-the-art style transfer approaches."
