SOTAVerified

ReCoNet: Real-time Coherent Video Style Transfer Network

2018-07-03 · Code Available

Chang Gao, Derun Gu, Fangjun Zhang, Yizhou Yu


Abstract

Image style transfer models based on convolutional neural networks usually suffer from high temporal inconsistency when applied to videos. Some video style transfer models have been proposed to improve temporal consistency, yet they fail to guarantee fast processing speed, good perceptual style quality, and high temporal consistency at the same time. In this paper, we propose a novel real-time video style transfer model, ReCoNet, which can generate temporally coherent style-transferred videos while maintaining favorable perceptual styles. A novel luminance warping constraint is added to the temporal loss at the output level to capture luminance changes between consecutive frames and increase stylization stability under illumination effects. We also propose a novel feature-map-level temporal loss to further enhance temporal consistency on traceable objects. Experimental results indicate that our model exhibits outstanding performance both qualitatively and quantitatively.
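The two temporal losses the abstract mentions can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the optical-flow warping and occlusion masks are assumed to be computed elsewhere, the Rec. 601 luma coefficients and the mean normalization are assumptions, and all function names are hypothetical.

```python
import numpy as np

def luminance(img):
    """Relative luminance of an H x W x 3 RGB image (Rec. 601 weights, assumed)."""
    return 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]

def output_temporal_loss(o_t, o_prev_warped, i_t, i_prev_warped, mask):
    """Output-level temporal loss with a luminance warping constraint (sketch).

    o_t, o_prev_warped: stylized frame t and flow-warped stylized frame t-1 (H x W x 3)
    i_t, i_prev_warped: input frame t and flow-warped input frame t-1 (H x W x 3)
    mask: occlusion mask (H x W), 1 where the flow is traceable, 0 otherwise
    """
    # Subtract the input's luminance change so the stylized output is allowed
    # to brighten/darken with the scene instead of being penalized for it.
    lum_change = luminance(i_t) - luminance(i_prev_warped)
    diff = (o_t - o_prev_warped) - lum_change[..., None]
    return float(np.mean(mask[..., None] * diff ** 2))

def feature_temporal_loss(f_t, f_prev_warped, mask):
    """Feature-map-level temporal loss on warped encoder features (H x W x C, sketch)."""
    return float(np.mean(mask[..., None] * (f_t - f_prev_warped) ** 2))
```

For identical, fully traceable consecutive frames both losses are zero; occluded pixels (mask = 0) contribute nothing, which matches the idea of enforcing consistency only on traceable regions.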

Tasks

Benchmark Results

Dataset       Model                    Metric  Claimed  Verified  Status
FMB Dataset   ReCoNet (RGB-Infrared)   mIoU    50.9     n/a       Unverified

Reproductions