CRS Arena: Crowdsourced Benchmarking of Conversational Recommender Systems

2024-12-13

Nolwenn Bernard, Hideaki Joko, Faegheh Hasibi, Krisztian Balog

Abstract

We introduce CRS Arena, a research platform for scalable benchmarking of Conversational Recommender Systems (CRS) based on human feedback. The platform displays pairwise battles between anonymous conversational recommender systems, where users interact with the systems one after the other before declaring either a winner or a draw. CRS Arena collects conversations and user feedback, providing a foundation for reliable evaluation and ranking of CRSs. We conduct experiments with CRS Arena on both open and closed crowdsourcing platforms, confirming that both setups produce highly correlated rankings of CRSs and conversations with similar characteristics. We release CRSArena-Dial, a dataset of 474 conversations and their corresponding user feedback, along with a preliminary ranking of the systems based on the Elo rating system. The platform is accessible at https://iai-group-crsarena.hf.space/.
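The abstract notes that systems are ranked with the Elo rating system based on pairwise battles that end in a win, loss, or draw. As a rough sketch of how such a ranking could be derived from battle outcomes, here is the standard Elo update rule; the K-factor, starting rating, and draw handling shown are illustrative assumptions, not the paper's reported parameters.

```python
def elo_update(r_a, r_b, score_a, k=32):
    """Standard Elo update after one pairwise battle.

    score_a is 1.0 if system A wins, 0.0 if it loses, 0.5 for a draw.
    k=32 is a common default K-factor (assumption, not from the paper).
    """
    # Expected score of A under the Elo model.
    expected_a = 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))
    new_a = r_a + k * (score_a - expected_a)
    # B's update mirrors A's: scores and expectations sum to 1.
    new_b = r_b + k * ((1.0 - score_a) - (1.0 - expected_a))
    return new_a, new_b

# Illustrative example: both systems start at 1000 and A wins one battle.
a, b = elo_update(1000, 1000, 1.0)
```

Replaying the collected battles in order through this update would yield a leaderboard like the preliminary ranking the authors release alongside CRSArena-Dial.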
