Universal Style Transfer via Feature Transforms
Yijun Li, Chen Fang, Jimei Yang, Zhaowen Wang, Xin Lu, Ming-Hsuan Yang
Code
- github.com/Yijunmaverick/UniversalStyleTransfer (official, in paper; tf; ★ 0)
- github.com/eridgd/WCT-TF (tf; ★ 323)
- github.com/viriditass/Style-transfer (pytorch; ★ 0)
- github.com/smaranjitghose/DeepHoli (★ 0)
- github.com/lihengbit/Pytorch1.4-WCT (pytorch; ★ 0)
- github.com/tcmxx/CNTKUnityTools (★ 0)
- github.com/liamheng/pytorch1.4-wct (pytorch; ★ 0)
- github.com/EndyWon/Deep-Feature-Perturbation (pytorch; ★ 0)
- github.com/yotharit/image_style_transfer (tf; ★ 0)
- github.com/rajatmodi62/PytorchWCT (pytorch; ★ 0)
Abstract
Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward methods, while efficient at inference, are mainly limited by an inability to generalize to unseen styles or by compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, embedded into an image reconstruction network. The whitening and coloring transforms directly match the feature covariance of the content image to that of a given style image, which shares a similar spirit with the optimization of the Gram-matrix-based cost in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images and comparing them with those produced by several recent methods. We also analyze our method by visualizing the whitened features and by synthesizing textures via simple feature coloring.
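The core operation the abstract describes, matching the covariance of content features to that of style features via whitening and coloring, can be sketched in NumPy as below. This is an illustrative implementation of the general whitening-and-coloring idea, not the paper's exact code: the feature shapes (channels × spatial positions), the regularization constant `eps`, and the helper name `whiten_color_transform` are assumptions for the sketch.

```python
import numpy as np

def whiten_color_transform(fc, fs, eps=1e-5):
    """Sketch of a whitening-and-coloring transform (WCT).

    fc: content features, shape (C, N)  -- C channels, N spatial positions
    fs: style features,   shape (C, M)
    Returns features with (approximately) the style covariance and mean.
    """
    # Center the content features.
    mc = fc.mean(axis=1, keepdims=True)
    fc = fc - mc
    # Whitening: peel off the content feature correlations so the
    # result has (approximately) identity covariance.
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(fc.shape[0])
    wc, vc = np.linalg.eigh(cov_c)
    whitened = vc @ np.diag(wc ** -0.5) @ vc.T @ fc
    # Coloring: impose the style covariance on the whitened features.
    ms = fs.mean(axis=1, keepdims=True)
    fs = fs - ms
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(fs.shape[0])
    ws, vs = np.linalg.eigh(cov_s)
    colored = vs @ np.diag(ws ** 0.5) @ vs.T @ whitened
    # Re-center on the style mean.
    return colored + ms
```

In the paper's setting these features would come from VGG encoder layers, and the transformed features would be fed to the corresponding trained decoder; the sketch above only shows the covariance-matching step itself.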