DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis
2020-04-28 · Findings of the Association for Computational Linguistics
Hu Xu, Bing Liu, Lei Shu, Philip S. Yu
- Code: github.com/howardhsu/BERT-for-RRC-ABSA (PyTorch, ★ 465)
Abstract
This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the strengths of general-purpose language models (such as ELMo and BERT) with domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains, enabling domain language models to be trained in low-resource settings. Experiments on an assortment of aspect-based sentiment analysis tasks demonstrate promising results.