
ConIsI: A Contrastive Framework with Inter-sentence Interaction for Self-supervised Sentence Representation

2022-10-01 · CCL 2022

Sun Meng, Huang Degen


Abstract

Learning sentence representations is a fundamental task in natural language processing and has been studied extensively. Recently, many works have obtained high-quality sentence representations from pre-trained models via contrastive learning. However, these works suffer from an inconsistency of input forms between the pre-training and fine-tuning stages. They also typically encode each sentence independently, lacking feature interaction between sentences. To overcome these issues, we propose a novel Contrastive framework with Inter-sentence Interaction (ConIsI), which introduces a sentence-level objective that improves contrastively learned sentence representations through fine-grained interaction between sentences. The sentence-level objective guides the model to focus on fine-grained semantic information via feature interaction between sentences, and we design three different sentence construction strategies to explore its effect. We conduct experiments on seven Semantic Textual Similarity (STS) tasks. The experimental results show that our ConIsI models based on BERTbase and RoBERTabase achieve state-of-the-art performance, substantially outperforming the previous best models SimCSE-BERTbase and SimCSE-RoBERTabase by 2.05% and 0.77% respectively.
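The contrastive objective that ConIsI builds on (following SimCSE, which it compares against) can be illustrated with a minimal InfoNCE-style loss: two views of the same sentence form a positive pair, and all other in-batch sentences act as negatives. The sketch below is an illustrative assumption about the standard setup, not the paper's exact training code; the embeddings, batch size, and temperature are placeholders.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.05):
    """Minimal in-batch InfoNCE contrastive loss.

    z1[i] and z2[i] are two views (e.g. two dropout-masked encodings) of
    the same sentence; every z2[j] with j != i serves as a negative.
    """
    # L2-normalise so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # (batch, batch) similarity matrix
    # Cross-entropy with the matching index i as the "label" for row i.
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Nearly identical views: the positives dominate and the loss is near zero.
low = info_nce_loss(z, z + 0.01 * rng.normal(size=z.shape))
# Unrelated views: the positives are no better than negatives, so loss is high.
high = info_nce_loss(z, rng.normal(size=(8, 16)))
```

ConIsI's contribution, per the abstract, is to add a sentence-level objective with fine-grained inter-sentence interaction on top of such a contrastive loss, rather than encoding each sentence in isolation.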
