
Generative Refocusing: Flexible Defocus Control from a Single Image

2026-03-18 · Code Available

Chun-Wei Tuan Mu, Cheng-De Fan, Jia-Bin Huang, Yu-Lun Liu


Abstract

Depth-of-field control is essential in photography, yet achieving perfect focus often takes multiple attempts or specialized equipment. Refocusing from a single image remains difficult: it requires both recovering sharp content and synthesizing realistic bokeh. Existing methods have significant drawbacks: they require all-in-focus inputs, rely on synthetic data from simulators, or offer only limited aperture control. We introduce Generative Refocusing, a two-stage approach that uses DeblurNet to recover an all-in-focus image from diverse inputs and BokehNet to synthesize controllable bokeh. Our method combines synthetic and real bokeh images to achieve precise control while preserving authentic optical characteristics. Experiments show state-of-the-art performance on defocus deblurring, bokeh synthesis, and refocusing benchmarks. In addition, Generative Refocusing supports custom aperture shapes. Project page: https://generative-refocusing.github.io/
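The two-stage pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names `deblur_net` and `bokeh_net` mirror the abstract but are stand-ins (identity and uniform aperture-kernel convolution, respectively), whereas the real models are learned networks with depth-dependent rendering. The disk kernel stands in for a selectable aperture shape.

```python
import numpy as np

def disk_kernel(radius):
    """Circular aperture kernel; custom shapes (hexagon, star, ...) could be
    substituted here to change the bokeh highlights."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    mask = (x**2 + y**2 <= radius**2).astype(float)
    return mask / mask.sum()

def deblur_net(image):
    """Placeholder for stage 1 (DeblurNet): return an all-in-focus estimate.
    Identity here; the real model is a learned deblurring network."""
    return image

def bokeh_net(image, kernel):
    """Placeholder for stage 2 (BokehNet): uniform convolution with the
    aperture kernel. The real model renders depth-dependent, learned bokeh."""
    pad = kernel.shape[0] // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros_like(image, dtype=float)
    h, w = image.shape
    for dy in range(kernel.shape[0]):
        for dx in range(kernel.shape[1]):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

def generative_refocus(image, radius):
    sharp = deblur_net(image)                      # stage 1: recover sharp content
    return bokeh_net(sharp, disk_kernel(radius))   # stage 2: synthesize bokeh

# A single bright point spreads into the aperture shape (a disk here),
# while total light energy is preserved.
img = np.zeros((9, 9))
img[4, 4] = 1.0
out = generative_refocus(img, 2)
print(round(out.sum(), 6))  # → 1.0
```

Swapping `disk_kernel` for any other normalized binary mask changes the rendered bokeh shape, which is the intuition behind the custom aperture control mentioned in the abstract.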
