
Ferryman at SemEval-2020 Task 5: Optimized BERT for Detecting Counterfactuals

2020-12-01 · SemEval

Weilong Chen, Yan Zhuang, Peng Wang, Feng Hong, Yan Wang, Yanru Zhang


Abstract

The main purpose of this article is to present the effect of using different methods and models for counterfactual detection and the extraction of causal knowledge. Counterfactual reasoning is now widely used across many fields; in natural language processing (NLP), it has great potential to improve the correctness of sentence understanding. For the shared Task 5 of SemEval-2020, Detecting Counterfactuals, we pre-process the officially provided dataset with case conversion, stemming, and abbreviation replacement. We use the last five hidden layers of a bidirectional encoder representations from transformers (BERT) model and a term frequency--inverse document frequency (TF-IDF) vectorizer for counterfactual detection. Meanwhile, multi-sample dropout and cross-validation are used to improve versatility and prevent problems such as poor generalization caused by overfitting. Finally, our team Ferryman ranked 8th in sub-task 1 of this competition.
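The multi-sample dropout technique mentioned in the abstract can be sketched as follows: each forward pass applies several independent dropout masks to the same feature vector, feeds each masked copy through a shared classification head, and averages the resulting logits. This is a minimal NumPy illustration; the function name, hyperparameters, and the use of a plain linear head in place of the paper's BERT-based classifier are all assumptions for demonstration.

```python
import numpy as np

def multi_sample_dropout_logits(features, W, b, num_samples=5, p=0.3, rng=None):
    """Average logits over several independent dropout masks.

    Illustrative sketch of multi-sample dropout (hyperparameters are
    assumptions, not the paper's): each sample draws its own dropout
    mask over the shared feature vector, passes the masked features
    through the same linear head (W, b), and the logits are averaged.
    """
    rng = rng or np.random.default_rng(0)
    logits = np.zeros(W.shape[1])
    for _ in range(num_samples):
        mask = rng.random(features.shape) >= p       # keep each unit with prob 1-p
        dropped = features * mask / (1.0 - p)        # inverted-dropout rescaling
        logits += dropped @ W + b                    # shared classification head
    return logits / num_samples
```

Averaging over masks smooths the training signal of a single classifier head, which is why the technique is often paired with cross-validation as a regularizer against overfitting.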
