SOTAVerified

Enhanced Bi-directional Motion Estimation for Video Frame Interpolation

2022-06-17 · Code Available

Xin Jin, Longhai Wu, Guotao Shen, Youxin Chen, Jie Chen, Jayoon Koo, Cheul-hee Hahm


Abstract

We present a novel, simple yet effective algorithm for motion-based video frame interpolation. Existing motion-based interpolation methods typically rely on a pre-trained optical flow model or a U-Net based pyramid network for motion estimation, and thus either suffer from a large model size or from limited capacity in handling complex and large motion. In this work, by carefully integrating intermediate-oriented forward warping, a lightweight feature encoder, and a correlation volume into a pyramid recurrent framework, we derive a compact model that simultaneously estimates the bi-directional motion between input frames. It is 15 times smaller than PWC-Net, yet enables more reliable and flexible handling of challenging motion cases. Based on the estimated bi-directional motion, we forward-warp the input frames and their context features to the intermediate frame, and employ a synthesis network to estimate the intermediate frame from the warped representations. Our method achieves excellent performance on a broad range of video frame interpolation benchmarks. Code and trained models are available at https://github.com/srcn-ivl/EBME.
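The abstract's forward-warping step moves each source pixel along its (time-scaled) motion vector to the intermediate frame, rather than sampling backward from it. The sketch below illustrates that idea under simplifying assumptions: grayscale frames, a linear-motion scaling of the flow by `t`, and nearest-neighbor splatting with average normalization. The function name and all details are illustrative, not the paper's actual warping operator.

```python
import numpy as np

def forward_warp(frame, flow, t=0.5):
    """Forward-warp a grayscale frame toward time t.

    frame: (H, W) array of intensities at time 0.
    flow:  (H, W, 2) optical flow from frame 0 to frame 1, as (dx, dy).
    The flow to the intermediate frame is approximated as t * flow
    (a linear-motion assumption). Pixels are splatted to the nearest
    target location; colliding pixels are averaged.
    """
    H, W = frame.shape
    out = np.zeros((H, W), dtype=np.float64)
    weight = np.zeros((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            # Destination of pixel (x, y) at time t.
            tx = x + t * flow[y, x, 0]
            ty = y + t * flow[y, x, 1]
            ix, iy = int(round(tx)), int(round(ty))  # nearest-neighbor splat
            if 0 <= ix < W and 0 <= iy < H:
                out[iy, ix] += frame[y, x]
                weight[iy, ix] += 1.0
    covered = weight > 0           # holes (never-hit pixels) stay zero
    out[covered] /= weight[covered]
    return out, covered
```

Warping both input frames this way (with `t * flow_0to1` and `(1 - t) * flow_1to0`) yields two intermediate-aligned images whose holes and collisions a synthesis network can then resolve.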

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| MSU Video Frame Interpolation | EBME-H | PSNR (dB) | 28.77 | – | Unverified |
| MSU Video Frame Interpolation | EBME | PSNR (dB) | 28.56 | – | Unverified |
| SNU-FILM (easy) | EBME-H* | PSNR (dB) | 40.28 | – | Unverified |
| SNU-FILM (medium) | EBME-H* | PSNR (dB) | 36.07 | – | Unverified |
| SNU-FILM (hard) | EBME-H* | PSNR (dB) | 30.64 | – | Unverified |
| SNU-FILM (extreme) | EBME-H* | PSNR (dB) | 25.40 | – | Unverified |
| UCF101 | EBME-H* | PSNR (dB) | 35.41 | – | Unverified |
| Vimeo90K | EBME-H* | PSNR (dB) | 36.19 | – | Unverified |
| X4K1000FPS | EBME-H* | PSNR (dB) | 29.46 | – | Unverified |
