
Online Clustering of Bandits

2014-01-31

Claudio Gentile, Shuai Li, Giovanni Zappella


Abstract

We introduce a novel algorithmic approach to content recommendation based on adaptive clustering of exploration-exploitation ("bandit") strategies. We provide a sharp regret analysis of this algorithm in a standard stochastic noise setting, demonstrate its scalability properties, and prove its effectiveness on a number of artificial and real-world datasets. Our experiments show a significant increase in prediction performance over state-of-the-art methods for bandit problems.
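The abstract describes adaptively clustering users who share the same underlying preferences, so that their bandit statistics can be pooled. The sketch below is a simplified, scalar-reward illustration of this idea (not the paper's exact CLUB algorithm, which operates in a linear-payoff setting): the learner starts from a complete graph over users, deletes an edge whenever two users' empirical arm estimates are separated by more than their combined confidence widths, and runs a UCB rule on statistics aggregated over each user's current connected component. All function names, the noise level, and the confidence-width constant `alpha` are illustrative assumptions.

```python
import math
import random

def cluster_bandit(n_users, means, T, alpha=2.0, seed=0):
    """Simplified sketch of adaptive clustering of bandit strategies.

    means[u][a] is the (unknown to the learner) expected reward of arm a
    for user u; users with identical mean vectors form latent clusters.
    Returns the cumulative reward and the recovered clustering.
    """
    rng = random.Random(seed)
    n_arms = len(means[0])
    counts = [[0] * n_arms for _ in range(n_users)]   # per-user pull counts
    sums = [[0.0] * n_arms for _ in range(n_users)]   # per-user reward sums
    # Start with the complete graph over users (adjacency sets).
    adj = [set(range(n_users)) - {u} for u in range(n_users)]

    def cb(u, a):
        # Confidence half-width for user u's estimate of arm a (assumed form).
        return alpha * math.sqrt(math.log(T) / max(1, counts[u][a]))

    def cluster_of(u):
        # Connected component of u in the current graph (depth-first search).
        seen, stack = {u}, [u]
        while stack:
            v = stack.pop()
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen

    total = 0.0
    for _ in range(T):
        u = rng.randrange(n_users)           # user to serve this round
        comp = cluster_of(u)

        def index(a):
            # UCB index on statistics pooled over u's current cluster.
            c = sum(counts[v][a] for v in comp)
            if c == 0:
                return float("inf")
            m = sum(sums[v][a] for v in comp) / c
            return m + alpha * math.sqrt(math.log(T) / c)

        a = max(range(n_arms), key=index)
        r = means[u][a] + rng.gauss(0.0, 0.1)  # noisy observed payoff
        counts[u][a] += 1
        sums[u][a] += r
        total += r
        # Edge deletion: separate u from neighbours whose estimates differ
        # by more than the combined confidence widths on some arm.
        for v in list(adj[u]):
            for b in range(n_arms):
                if counts[u][b] and counts[v][b]:
                    gap = abs(sums[u][b] / counts[u][b]
                              - sums[v][b] / counts[v][b])
                    if gap > cb(u, b) + cb(v, b):
                        adj[u].discard(v)
                        adj[v].discard(u)
                        break
    # Recover the final clustering as connected components.
    clusters, seen = [], set()
    for u in range(n_users):
        if u not in seen:
            comp = cluster_of(u)
            seen |= comp
            clusters.append(sorted(comp))
    return total, clusters
```

With enough rounds, users from different latent clusters accumulate clearly separated estimates and their edges are cut, after which each surviving component pools data only among like-minded users; this pooling is what drives the sample-efficiency gains the abstract reports.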
