An Exploration of Prompt-Based Zero-Shot Relation Extraction Method
Anonymous
Abstract
Zero-shot relation extraction is an important approach for handling newly emerging real-world relations that lack labeled data. However, current zero-shot methods usually rely on large-scale, in-domain labeled data for predefined relations. In this work, we view zero-shot relation extraction as a semantic matching task optimized by prompt-tuning, which maintains high generalization ability even with a single labeled instance per predefined relation. Specifically, we reduce the dependence on labeled data for predefined relations by exploiting the knowledge already present in pretrained language models (PLMs). To induce this knowledge, we fuse the original input with a prompt template to formulate a cloze-style task, which is consistent with the pretraining stage. The induced knowledge facilitates new relation discovery when large-scale labeled data are not available. Experimental results on two academic datasets show that our method achieves performance similar to the current state-of-the-art method with only 0.05\% of the labeled data for predefined relations. Using all labeled data, our method improves the F1 score by nearly 30\% and 9\% on the two datasets, respectively.
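The cloze-style formulation mentioned above can be sketched as follows. This is a minimal illustration, not the paper's actual prompt: the template wording, the function name, and the label-word example are assumptions introduced for clarity.

```python
# Illustrative sketch (assumed template, not the paper's exact prompt):
# the input sentence is fused with a prompt template containing a [MASK]
# slot, so the PLM can fill the slot with a relation label word, turning
# relation extraction into the same cloze task seen during pretraining.
def build_cloze_prompt(sentence: str, head: str, tail: str,
                       mask_token: str = "[MASK]") -> str:
    """Fuse the input with a prompt template to form a cloze-style task."""
    return f"{sentence} The relation between {head} and {tail} is {mask_token}."

prompt = build_cloze_prompt(
    "Steve Jobs co-founded Apple in 1976.", "Steve Jobs", "Apple")
# A PLM would then score candidate label words (e.g. "founder") at the
# [MASK] position; matching the predicted word against verbalized relation
# labels yields the semantic matching view described in the abstract.
```

Under this view, unseen relations only require new label words for the mask slot, which is why the approach needs little or no labeled data for predefined relations.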