
Random Network Distillation as a Diversity Metric for Both Image and Text Generation

2020-10-13

Liam Fowl, Micah Goldblum, Arjun Gupta, Amr Sharaf, Tom Goldstein


Abstract

Generative models are increasingly able to produce remarkably high-quality images and text. The community has developed numerous evaluation metrics for comparing generative models; however, these metrics do not effectively quantify data diversity. We develop a new diversity metric that can readily be applied to both synthetic and natural data of any type. Our method employs random network distillation, a technique introduced in reinforcement learning. We validate and deploy this metric on both images and text. We further explore diversity in few-shot image generation, a setting that was previously difficult to evaluate.
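For intuition, below is a minimal PyTorch sketch of the random network distillation mechanism the abstract refers to: a trainable predictor is fit to mimic a frozen, randomly initialized target network, and the residual prediction error on an input acts as a novelty signal. The protocol shown (fit the predictor on generated samples, then score held-out inputs) is an illustrative assumption, not necessarily the paper's exact procedure, and all names (`make_net`, `rnd_error`), dimensions, and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

def make_net(in_dim: int = 784, out_dim: int = 64) -> nn.Sequential:
    # Small MLP; the same architecture serves as the frozen target
    # and the trained predictor.
    return nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, out_dim))

target, predictor = make_net(), make_net()
for p in target.parameters():
    p.requires_grad_(False)  # the target stays at its random initialization

opt = torch.optim.Adam(predictor.parameters(), lr=1e-4)

def rnd_error(x: torch.Tensor) -> torch.Tensor:
    # Per-sample prediction error against the fixed random network.
    return ((predictor(x) - target(x)) ** 2).mean(dim=1)

# Stand-in for generator output (flattened 28x28 images); in practice these
# would be samples drawn from the model under evaluation.
generated = torch.rand(1024, 784)

# Fit the predictor to mimic the random target on the generated data.
for _ in range(200):
    batch = generated[torch.randint(0, len(generated), (128,))]
    loss = rnd_error(batch).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Score fresh inputs: under this illustrative protocol, regions of input
# space the generator covers yield low error, while uncovered regions
# yield high error.
with torch.no_grad():
    held_out = torch.rand(256, 784)
    print(rnd_error(held_out).mean().item())
```

Because the target is random and frozen, the predictor can only drive the error down on inputs resembling its training data; how much error remains elsewhere reflects how much of the input space the generated samples covered, which is what makes the signal usable as a diversity proxy.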
