
Combinatorial Multi-armed Bandits for Real-Time Strategy Games

2017-10-13

Santiago Ontañón


Abstract

Games with large branching factors pose a significant challenge for game-tree search algorithms. In this paper, we address this problem with a sampling strategy for Monte Carlo Tree Search (MCTS) algorithms called naïve sampling, based on a variant of the Multi-armed Bandit problem called the Combinatorial Multi-armed Bandit (CMAB). We analyze the theoretical properties of several variants of naïve sampling and empirically compare them against the other existing strategies in the literature for CMABs. We then evaluate these strategies in the context of real-time strategy (RTS) games, a genre of computer games characterized by very large branching factors. Our results show that as the branching factor grows, naïve sampling outperforms the other sampling strategies.
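As a rough illustration of the idea behind naïve sampling (a sketch of the general technique, not the paper's exact algorithm), the code below treats a combined action as a vector of independent component choices: with some probability it explores by picking each component with its own local ε-greedy bandit, and otherwise it exploits a global bandit over the combined actions evaluated so far. The function name, parameters, and toy reward are illustrative assumptions.

```python
import random

def naive_sampling(component_values, reward_fn, iterations=2000,
                   eps0=0.4, eps_local=0.2, seed=0):
    """Sketch of naive sampling for a CMAB.

    component_values: list of lists, the legal values for each component.
    reward_fn: maps a tuple (one value per component) to a scalar reward.
    Relies on the "naive" assumption that the reward decomposes roughly
    into per-component contributions.
    """
    rng = random.Random(seed)
    n = len(component_values)
    # Local estimates: reward totals and counts per (component, value).
    local_sum = [dict.fromkeys(vals, 0.0) for vals in component_values]
    local_cnt = [dict.fromkeys(vals, 0) for vals in component_values]
    # Global estimates over the combined actions seen so far.
    glob_sum, glob_cnt = {}, {}

    def local_pick(i):
        # Epsilon-greedy over the legal values of component i.
        if rng.random() < eps_local or not any(local_cnt[i].values()):
            return rng.choice(component_values[i])
        return max(component_values[i],
                   key=lambda v: local_sum[i][v] / max(local_cnt[i][v], 1))

    for _ in range(iterations):
        if rng.random() < eps0 or not glob_cnt:
            # Explore: assemble a combined action component by component.
            arm = tuple(local_pick(i) for i in range(n))
        else:
            # Exploit: greedy over combined actions already evaluated.
            arm = max(glob_cnt, key=lambda a: glob_sum[a] / glob_cnt[a])
        r = reward_fn(arm)
        for i, v in enumerate(arm):
            local_sum[i][v] += r
            local_cnt[i][v] += 1
        glob_sum[arm] = glob_sum.get(arm, 0.0) + r
        glob_cnt[arm] = glob_cnt.get(arm, 0) + 1

    # Return the best combined action found.
    return max(glob_cnt, key=lambda a: glob_sum[a] / glob_cnt[a])
```

The point of the decomposition is that the local bandits steer exploration toward promising regions without enumerating every combined action, which is what makes this style of sampling attractive when the branching factor is combinatorial.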
