SOTAVerified

The Impact of Negative Sampling on Contrastive Structured World Models

2021-07-24 · Code Available

Ondrej Biza, Elise van der Pol, Thomas Kipf


Abstract

World models trained by contrastive learning are a compelling alternative to autoencoder-based world models, which learn by reconstructing pixel states. In this paper, we describe three cases where small changes in how we sample negative states in the contrastive loss lead to drastic changes in model performance. In previously studied Atari datasets, we show that leveraging time step correlations can double the performance of the Contrastive Structured World Model. We also collect a full version of the datasets to study contrastive learning under a more diverse set of experiences.
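The negative-sampling choice the abstract refers to can be illustrated with a minimal sketch of a C-SWM-style contrastive hinge loss. Everything below is an assumption for illustration: the function names, the use of squared-Euclidean energy, and the two sampling strategies (uniform negatives vs. time-correlated negatives drawn from the same episode) stand in for the variants studied in the paper, not its exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(z_a, z_b):
    # Squared Euclidean distance as the energy between two latent states
    # (the form used in C-SWM-style models; illustrative here).
    return np.sum((z_a - z_b) ** 2, axis=-1)

def contrastive_hinge_loss(z_pred, z_next, z_neg, margin=1.0):
    # Positive term: predicted next latent should be close to the true one.
    pos = energy(z_pred, z_next)
    # Negative term: a sampled negative should be at least `margin` away.
    neg = np.maximum(0.0, margin - energy(z_neg, z_next))
    return float(np.mean(pos + neg))

# Latent states for a batch of episodes: (num_episodes, T, dim).
# Random placeholders stand in for encoder outputs.
states = rng.normal(size=(4, 10, 8))

def sample_negatives_uniform(states, rng):
    # Uniform negatives: any state from any episode and time step.
    e = rng.integers(0, states.shape[0], size=states.shape[0])
    t = rng.integers(0, states.shape[1], size=states.shape[0])
    return states[e, t]

def sample_negatives_same_episode(states, rng):
    # Time-correlated negatives: other time steps of the same episode,
    # the kind of small sampling change the abstract highlights.
    t = rng.integers(0, states.shape[1], size=states.shape[0])
    return states[np.arange(states.shape[0]), t]

z_pred, z_next = states[:, 0], states[:, 1]
loss_uniform = contrastive_hinge_loss(z_pred, z_next,
                                      sample_negatives_uniform(states, rng))
loss_episode = contrastive_hinge_loss(z_pred, z_next,
                                      sample_negatives_same_episode(states, rng))
```

Swapping one sampler for the other changes only a few lines, which is why such choices are easy to overlook despite their reported effect on model performance.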
