
Compressing Facial Makeup Transfer Networks by Collaborative Distillation and Kernel Decomposition

2020-09-16

Bianjiang Yang, Zi Hui, Haoji Hu, Xinyi Hu, Lu Yu


Abstract

Although facial makeup transfer networks achieve high-quality performance in generating perceptually pleasing makeup images, their capability is still restricted by the heavy computation and storage demands of the network architecture. We address this issue by compressing facial makeup transfer networks with collaborative distillation and kernel decomposition. Collaborative distillation is underpinned by the finding that encoder-decoder pairs form an exclusive collaborative relationship, which can be regarded as a new kind of knowledge for low-level vision tasks. For kernel decomposition, we apply depth-wise separation of convolutional kernels to build a lightweight Convolutional Neural Network (CNN) from the original network. Extensive experiments show the effectiveness of the compression method when applied to the state-of-the-art facial makeup transfer network, BeautyGAN.
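To give a sense of why depth-wise separation of convolutional kernels compresses a network, the sketch below compares parameter counts for a standard convolution against its depth-wise separable factorization (a per-channel k×k depth-wise convolution followed by a 1×1 point-wise convolution). This is a generic illustration of the kernel-decomposition idea, not the paper's exact BeautyGAN compression pipeline; the layer sizes are hypothetical examples.

```python
def conv_params(c_in, c_out, k):
    # standard convolution: one k x k kernel per (input, output) channel pair
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    # depth-wise step: one k x k kernel per input channel (no channel mixing),
    # point-wise step: a 1 x 1 convolution that mixes channels (c_in * c_out)
    return c_in * k * k + c_in * c_out

# hypothetical 3x3 layer with 256 input and 256 output channels
standard = conv_params(256, 256, 3)                   # 589,824 parameters
separable = depthwise_separable_params(256, 256, 3)   # 67,840 parameters
print(f"compression ratio: {standard / separable:.1f}x")  # about 8.7x
```

For k×k kernels the saving approaches a factor of k² as the channel count grows, which is why this decomposition is a common building block for lightweight CNNs.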
