Zero-Shot Learning with Common Sense Knowledge Graphs
Nihal V. Nayak, Stephen H. Bach
Code
- github.com/BatsResearch/zsl-kg — official, in paper, PyTorch, ★ 113
- github.com/BatsResearch/nayak-arxiv20-code — official, in paper, PyTorch, ★ 9
- github.com/batsresearch/nayak-tmlr22-code — official, in paper, PyTorch, ★ 9
Abstract
Zero-shot learning relies on semantic class representations such as hand-engineered attributes or learned embeddings to predict classes without any labeled examples. We propose to learn class representations by embedding nodes from common sense knowledge graphs in a vector space. Common sense knowledge graphs are an untapped source of explicit high-level knowledge that requires little human effort to apply to a range of tasks. To capture the knowledge in the graph, we introduce ZSL-KG, a general-purpose framework with a novel transformer graph convolutional network (TrGCN) for generating class representations. Our proposed TrGCN architecture computes non-linear combinations of node neighbourhoods. Our results show that ZSL-KG improves over existing WordNet-based methods on five out of six zero-shot benchmark datasets in language and vision.
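The abstract contrasts TrGCN's non-linear neighbourhood aggregation with the fixed linear (mean) aggregation of a vanilla GCN. The following is a minimal NumPy sketch of that idea, not the authors' implementation: self-attention over a node's neighbour set followed by permutation-invariant pooling. The function name `trgcn_aggregate` and the weight matrices are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def trgcn_aggregate(neigh, Wq, Wk, Wv):
    """Non-linear aggregation of a node's neighbourhood:
    self-attention over the neighbour set, then mean pooling.
    A vanilla GCN layer would instead take a fixed linear mean."""
    Q, K, V = neigh @ Wq, neigh @ Wk, neigh @ Wv
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # attention among neighbours
    transformed = attn @ V                          # each neighbour re-expressed via the set
    return transformed.mean(axis=0)                 # order-invariant pooling

rng = np.random.default_rng(0)
n, d = 4, 8                                        # 4 neighbours, 8-dim features
neigh = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
h = trgcn_aggregate(neigh, Wq, Wk, Wv)
print(h.shape)
```

Because attention scores depend on interactions among neighbour features, the pooled vector is a non-linear function of the neighbourhood, yet remains invariant to neighbour ordering.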
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| aPY (zero-shot) | ZSL-KG | Top-1 accuracy (%) | 60.54 | — | Unverified |
| AwA2 | ZSL-KG | Average top-1 accuracy (%) | 78.08 | — | Unverified |
| SNIPS | ZSL-KG | Accuracy (%) | 88.98 | — | Unverified |