Binding Actions to Objects in World Models
Ondrej Biza, Robert Platt, Jan-Willem van de Meent, Lawson L. S. Wong, Thomas Kipf
Code: github.com/ondrejba/action_attention_iclr_22 (official implementation, PyTorch)
Abstract
We study the problem of binding actions to objects in object-factored world models using action-attention mechanisms. We propose two attention mechanisms for binding actions to objects, soft attention and hard attention, which we evaluate in the context of structured world models across five environments. Our experiments show that hard attention helps contrastively-trained structured world models learn to separate individual objects in an object-based grid-world environment. Further, we show that soft attention increases the performance of factored world models trained on a robotic manipulation task. The learned action-attention weights also make the factored world model interpretable, as the attention focuses on the manipulated object in the environment.
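To make the binding idea concrete, the sketch below shows one plausible form of soft action attention over object slots: a scorer assigns each slot a weight given the action, a softmax normalizes the weights, and the action is routed to each slot in proportion to its weight. This is a minimal illustration under assumed shapes and a linear scorer, not the paper's exact architecture (the official repository uses learned PyTorch modules).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def soft_action_attention(slots, action, W):
    """Bind an action vector to object slots via soft attention.

    slots:  (K, D) object-slot embeddings
    action: (A,)   action embedding
    W:      (D + A,) weights of a linear scorer (stands in for the
            paper's learned attention network -- an assumption)
    Returns the per-slot bound action (K, A) and attention weights (K,).
    """
    K = slots.shape[0]
    act = np.tile(action, (K, 1))                  # (K, A): action paired with each slot
    pairs = np.concatenate([slots, act], axis=1)   # (K, D + A)
    scores = pairs @ W                             # (K,): one score per slot
    weights = softmax(scores)                      # soft attention over slots
    bound = weights[:, None] * act                 # action routed to each slot
    return bound, weights

def hard_action_attention(slots, action, W):
    """Hard variant: route the full action to the single highest-scoring slot."""
    bound, weights = soft_action_attention(slots, action, W)
    one_hot = np.eye(weights.shape[0])[weights.argmax()]
    return one_hot[:, None] * np.tile(action, (weights.shape[0], 1)), one_hot
```

In the soft case every slot receives a weighted copy of the action, so gradients flow to all slots; the hard case commits to a single object, which matches the paper's finding that hard attention encourages cleaner object separation.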