
Parameter-Efficient Interventions for Enhanced Model Merging

2024-12-22

Marcin Osial, Daniel Marczak, Bartosz Zieliński


Abstract

Model merging combines knowledge from task-specific models into a unified multi-task model, avoiding joint training on all task data. However, current methods suffer from representation bias, which can interfere with task performance. As a remedy, we propose IntervMerge, a novel approach to multi-task model merging that effectively mitigates representation bias across the model using task-specific interventions. To further enhance its efficiency, we introduce mini-interventions, which modify only part of the representation, thereby reducing the additional parameters without compromising performance. Experimental results demonstrate that IntervMerge consistently outperforms state-of-the-art approaches while using fewer parameters.
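The abstract does not spell out how a mini-intervention is implemented. As a hedged illustration only, one common way to realize a parameter-efficient, task-specific intervention is a learned additive offset applied to just a slice of a layer's hidden representation; the function name, slice convention, and additive form below are assumptions for the sketch, not the paper's actual method.

```python
import numpy as np

def mini_intervention(hidden: np.ndarray, offset: np.ndarray, start: int) -> np.ndarray:
    """Apply a task-specific additive offset to a contiguous slice of the
    hidden representation, leaving the remaining features untouched.

    hidden: activations of shape (..., d)
    offset: learned per-task parameters of shape (k,), with k <= d
    start:  index of the first feature the intervention modifies
    """
    out = hidden.copy()
    out[..., start:start + offset.shape[-1]] += offset
    return out

# Toy example: d = 8 features, but the intervention owns only k = 4 parameters,
# i.e. half the cost of a full-width offset over the representation.
hidden = np.zeros((2, 8))
offset = np.ones(4)
corrected = mini_intervention(hidden, offset, start=2)
```

In this sketch the parameter savings come directly from `k < d`: a full-representation intervention would need `d` parameters per task and layer, while the mini-intervention needs only `k`.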
