TEMPLATE: TempRel Classification Model Trained with Embedded Temporal Relation Knowledge

2022-01-16 · ACL ARR January 2022

Anonymous

Abstract

Mainstream Temporal Relation (TempRel) classification methods do not take advantage of the large amount of semantic information contained in golden TempRel labels, which is lost by traditional discrete one-hot labels. We therefore propose a new approach that makes full use of golden TempRel label information and improves model performance. First, we build a TempRel Classification model consisting of RoBERTa and a classifier. Second, we establish fine-grained templates that automatically generate sentences to enrich golden TempRel label information, and use them to build an Enhanced Data-set. Third, we use the Enhanced Data-set to train a Knowledge Encoder, which has the same structure as the TempRel Classification model, and obtain embedded knowledge. Finally, we train the TempRel Classification model with EMbedded temPoral reLATion knowledgE (TEMPLATE) using our designed Cosine balanced MSE loss function. Extensive experimental results show that our approach achieves new state-of-the-art results on TB-Dense and MATRES, outperforming the TempRel Classification model trained with only the traditional cross-entropy loss function by up to 5.51% F1 on TB-Dense and 2.02% F1 on MATRES.
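The abstract does not give the exact form of the Cosine balanced MSE loss. As a hedged illustration only, the sketch below assumes it interpolates between a mean-squared-error term and a cosine-distance term between the classifier's output and the Knowledge Encoder's embedded-knowledge vector; the function name, the `alpha` weight, and this formulation are assumptions, not the paper's definition.

```python
import math

def cosine_balanced_mse(student: list[float], teacher: list[float],
                        alpha: float = 0.5) -> float:
    """Hypothetical sketch of a 'cosine balanced MSE' distillation loss.

    student: output vector of the TempRel Classification model.
    teacher: embedded-knowledge vector from the Knowledge Encoder.
    alpha:   assumed balance weight between the MSE and cosine terms.
    """
    n = len(student)
    # Mean squared error between the two vectors.
    mse = sum((s - t) ** 2 for s, t in zip(student, teacher)) / n
    # Cosine distance (1 - cosine similarity) between the two vectors.
    dot = sum(s * t for s, t in zip(student, teacher))
    norm_s = math.sqrt(sum(s * s for s in student))
    norm_t = math.sqrt(sum(t * t for t in teacher))
    cos_dist = 1.0 - dot / (norm_s * norm_t)
    # Weighted combination: alpha balances the two components.
    return alpha * mse + (1.0 - alpha) * cos_dist
```

Under this assumed form, identical student and teacher vectors give a loss of zero, and the loss grows as the vectors diverge in either magnitude (MSE term) or direction (cosine term).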
