
Autonomous Learning of Object-Centric Abstractions for High-Level Planning

2021-01-01 · ICLR 2022

Steven James, Benjamin Rosman, George Konidaris


Abstract

We propose a method for autonomously learning an object-centric representation of a continuous and high-dimensional environment that is suitable for planning. Such representations can immediately be transferred between tasks that share the same types of objects, resulting in agents that require fewer samples to learn a model of a new task. We first demonstrate our approach on a simple domain where the agent learns a compact, lifted representation that generalises across objects. We then apply it to a series of Minecraft tasks to learn object-centric representations, including object types—directly from pixel data—that can be leveraged to solve new tasks quickly. The resulting learned representations enable the use of a task-level planner, resulting in an agent capable of forming complex, long-term plans with considerably fewer environment interactions.
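The abstract describes lifted, object-centric operators that are learned over object *types* and can therefore transfer to any new task containing objects of the same types, where a task-level planner composes them into long-horizon plans. The following is a minimal illustrative sketch of that idea, not the authors' implementation: all class names, symbols, and the toy door domain are hypothetical, and the planner is a plain breadth-first search over grounded STRIPS-style actions rather than the method from the paper.

```python
from collections import deque, namedtuple

Object = namedtuple("Object", "name type")

class LiftedOperator:
    """Hypothetical operator defined over an object *type*; grounding it on a
    concrete object of that type yields an ordinary STRIPS-style action."""
    def __init__(self, name, obj_type, precond, add, delete):
        self.name, self.obj_type = name, obj_type
        self.precond, self.add, self.delete = precond, add, delete

    def ground(self, obj):
        # Substitute the concrete object name into each symbol template.
        g = lambda syms: frozenset(s.format(o=obj.name) for s in syms)
        return (f"{self.name}({obj.name})",
                g(self.precond), g(self.add), g(self.delete))

def plan(objects, operators, start, goal):
    """Breadth-first search over grounded actions; returns a shortest plan."""
    actions = [op.ground(o) for op in operators
               for o in objects if o.type == op.obj_type]
    frontier = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while frontier:
        state, path = frontier.popleft()
        if goal <= state:
            return path
        for name, pre, add, delete in actions:
            if pre <= state:
                nxt = (state - delete) | add
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [name]))
    return None

# Toy example: the "open" operator, learned once for the type "door",
# grounds automatically on every door object in a new task.
ops = [LiftedOperator("open", "door",
                      {"{o}_closed"}, {"{o}_open"}, {"{o}_closed"}),
       LiftedOperator("walk_through", "door",
                      {"{o}_open"}, {"at_goal"}, set())]
objs = [Object("door1", "door"), Object("door2", "door")]
print(plan(objs, ops, {"door1_closed", "door2_closed"}, {"at_goal"}))
# → ['open(door1)', 'walk_through(door1)']
```

Because the operators mention only types, adding a third door to the task requires no relearning: the same two lifted operators simply ground on the extra object, which is the sample-efficiency benefit the abstract claims.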
