HittER: Hierarchical Transformers for Knowledge Graph Embeddings
Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, Yangfeng Ji
Code
- github.com/zjunlp/relphormer (PyTorch, ★ 140)
- github.com/microsoft/HittER (PyTorch, ★ 83)
- github.com/seeyourmind/tkgelib (PyTorch, ★ 18)
Abstract
This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity's neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity and the top block aggregates the relational information from outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
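The abstract describes a two-level architecture: a bottom Transformer that composes each (entity, relation) neighbor pair into a single feature vector, and a top Transformer that contextualizes the source entity against those vectors. The sketch below illustrates that hierarchy in PyTorch; all names, layer counts, and dimensions are illustrative assumptions, not the paper's actual hyperparameters or implementation.

```python
import torch
import torch.nn as nn


class HittERSketch(nn.Module):
    """Illustrative two-level Transformer in the spirit of HittER.

    Bottom block: entity-relation composition per neighbor pair.
    Top block: relational contextualization over the source entity's
    neighborhood. Sizes and depths here are placeholder assumptions.
    """

    def __init__(self, n_entities, n_relations, dim=64, heads=4):
        super().__init__()
        self.ent_emb = nn.Embedding(n_entities, dim)
        self.rel_emb = nn.Embedding(n_relations, dim)
        # [CLS]-style pooling tokens for each level (an assumption here).
        self.bottom_cls = nn.Parameter(torch.randn(dim))
        self.top_cls = nn.Parameter(torch.randn(dim))
        make_layer = lambda: nn.TransformerEncoderLayer(
            dim, heads, dim * 4, batch_first=True
        )
        self.bottom = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.top = nn.TransformerEncoder(make_layer(), num_layers=2)

    def forward(self, src_entity, ctx_entities, ctx_relations):
        # src_entity: (B,); ctx_entities, ctx_relations: (B, N) neighbor pairs.
        B, N = ctx_entities.shape
        e = self.ent_emb(ctx_entities)   # (B, N, dim)
        r = self.rel_emb(ctx_relations)  # (B, N, dim)
        # Bottom block: encode each pair as the sequence [CLS, entity, relation]
        # and pool via the CLS position.
        pair_seq = torch.stack([e, r], dim=2).view(B * N, 2, -1)
        cls = self.bottom_cls.expand(B * N, 1, -1)
        pair_repr = self.bottom(torch.cat([cls, pair_seq], dim=1))[:, 0]
        pair_repr = pair_repr.view(B, N, -1)
        # Top block: aggregate the source entity with its contextualized pairs.
        src = self.ent_emb(src_entity).unsqueeze(1)
        top_cls = self.top_cls.expand(B, 1, -1)
        out = self.top(torch.cat([top_cls, src, pair_repr], dim=1))
        return out[:, 0]  # (B, dim) contextual representation of the source
```

For link prediction, the pooled output would be scored against all entity embeddings; the paper's masked entity prediction objective further balances how much the model relies on the neighborhood versus the source entity itself.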
Tasks
- Link Prediction
- Question Answering
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| FB15k-237 | HittER | Hits@3 | 0.41 | — | Unverified |
| WN18RR | HittER | Hits@10 | 0.58 | — | Unverified |