Deeply-Recursive Convolutional Network for Image Super-Resolution
2015-11-14 · CVPR 2016 · Code Available
Jiwon Kim, Jung Kwon Lee, Kyoung Mu Lee
Abstract
We propose an image super-resolution (SR) method using a deeply-recursive convolutional network (DRCN). Our network has a very deep recursive layer (up to 16 recursions). Increasing the recursion depth can improve performance without introducing new parameters for the additional convolutions. Despite these advantages, learning a DRCN is very hard with a standard gradient descent method due to exploding/vanishing gradients. To ease the difficulty of training, we propose two extensions: recursive supervision and skip connections. Our method outperforms previous methods by a large margin.
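The abstract names three ideas: a shared filter applied recursively (so depth grows without new parameters), a skip connection that adds the input back, and recursive supervision that keeps every recursion's prediction and ensembles them. The following is a minimal NumPy sketch of that structure, not the paper's implementation: it uses a toy 1-D "convolution" and a single hypothetical 3-tap filter `w_rec` in place of the paper's deep 2-D network.

```python
import numpy as np

def conv(x, w):
    """Toy 1-D convolution with zero padding ('same' output length)."""
    pad = len(w) // 2
    xp = np.pad(x, pad)
    return np.array([xp[i:i + len(w)] @ w for i in range(len(x))])

def drcn_forward(x, w_rec, num_recursions=16):
    """DRCN-style inference sketch (simplified, assumptions noted above):
    the SAME filter w_rec is applied at every recursion, so raising
    num_recursions adds depth but zero new parameters."""
    h = x
    outputs = []
    for _ in range(num_recursions):
        h = np.maximum(conv(h, w_rec), 0.0)  # shared weights + ReLU
        outputs.append(h + x)  # skip connection: add the input back
    # recursive supervision: ensemble all intermediate predictions
    # (equal weights here; the paper learns the ensemble weights)
    return np.mean(outputs, axis=0)

x = np.linspace(0.0, 1.0, 8)          # stand-in for an interpolated LR signal
w = np.array([0.1, 0.8, 0.1])          # one shared filter, toy values
y = drcn_forward(x, w)
print(y.shape)                          # (8,)
```

Note how the parameter count is fixed at `len(w_rec)` regardless of `num_recursions`, which is the core argument of the abstract.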
Tasks
Image Super-Resolution
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| BSD100 - 2x upscaling | DRCN (Kim et al., 2016b) | PSNR | 31.85 | — | Unverified |
| BSD100 - 4x upscaling | DRCN (Kim et al., 2016b) | PSNR | 27.21 | — | Unverified |
| Set14 - 2x upscaling | DRCN (Kim et al., 2016b) | PSNR | 33.04 | — | Unverified |
| Set14 - 4x upscaling | DRCN (Kim et al., 2016b) | PSNR | 28.02 | — | Unverified |
| Set5 - 2x upscaling | DRCN (Kim et al., 2016b) | PSNR | 37.63 | — | Unverified |
| Urban100 - 2x upscaling | DRCN (Kim et al., 2016b) | PSNR | 30.75 | — | Unverified |
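The metric in the table is peak signal-to-noise ratio (PSNR) in dB, defined as `10·log10(MAX² / MSE)` between the ground-truth and reconstructed images. Below is a standard PSNR implementation for 8-bit intensities; note that SR papers conventionally report PSNR on the luminance (Y) channel of a cropped image, so reproducing the exact table values also depends on that evaluation protocol (an assumption not shown in this sketch).

```python
import numpy as np

def psnr(reference, estimate, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two images.

    reference, estimate: arrays of the same shape (pixel intensities).
    max_val: maximum possible pixel value (255 for 8-bit images).
    """
    diff = reference.astype(np.float64) - estimate.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy check: a uniform error of 5 intensity levels gives MSE = 25.
ref = np.full((4, 4), 128.0)
est = ref + 5.0
print(round(psnr(ref, est), 2))  # 34.15
```

Higher is better: the ~4.8 dB gap between the 2x and 4x Set14 rows reflects the much harder 4x reconstruction problem.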