SOTAVerified

Learning Universal Graph Neural Network Embeddings With Aid Of Transfer Learning

2019-09-22

Saurabh Verma, Zhi-Li Zhang

Code Available — Be the first to reproduce this paper.


Abstract

Learning powerful data embeddings has become a centerpiece of machine learning, especially in the natural language processing and computer vision domains. The crux of these embeddings is that they are pretrained on huge corpora of data in an unsupervised fashion, sometimes aided by transfer learning. Currently, however, in the graph learning domain, embeddings learned through existing graph neural networks (GNNs) are task-dependent and thus cannot be shared across different datasets. In this paper, we present the first powerful and theoretically guaranteed graph neural network designed to learn task-independent graph embeddings, hereafter referred to as deep universal graph embedding (DUGNN). Our DUGNN model incorporates a novel graph neural network (as a universal graph encoder) and leverages rich graph kernels (as a multi-task graph decoder) for both unsupervised learning and (task-specific) adaptive supervised learning. By learning task-independent graph embeddings across diverse datasets, DUGNN also reaps the benefits of transfer learning. Through extensive experiments and ablation studies, we show that the proposed DUGNN model consistently outperforms both existing state-of-the-art GNN models and graph kernels, improving accuracy by 3% to 8% on graph classification benchmark datasets.
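The abstract's "universal graph encoder" produces a task-independent, graph-level embedding from node features and graph structure. The paper's own architecture is not reproduced here; as a minimal sketch of the general idea, the following assumes a standard symmetrically normalized message-passing layer followed by sum pooling (the function names `gcn_layer` and `graph_embedding` are illustrative, not from the paper):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One message-passing layer with symmetric normalization
    D^{-1/2} (A + I) D^{-1/2}, as in standard GCNs, followed by ReLU.
    (Illustrative only; not the DUGNN encoder from the paper.)"""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

def graph_embedding(A, X, weights):
    """Stack message-passing layers, then sum-pool node representations
    into a single graph-level vector that could be shared across tasks."""
    H = X
    for W in weights:
        H = gcn_layer(A, H, W)
    return H.sum(axis=0)  # permutation-invariant readout
```

A task-specific decoder (in DUGNN's case, a graph-kernel-based multi-task decoder) would then be trained on top of such embeddings, while the encoder weights can be pretrained on diverse datasets and transferred.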

Tasks

Graph Classification

Benchmark Results

| Dataset  | Model | Metric   | Claimed | Verified | Status     |
|----------|-------|----------|---------|----------|------------|
| COLLAB   | DUGNN | Accuracy | 84.2    |          | Unverified |
| D&D      | DUGNN | Accuracy | 82.4    |          | Unverified |
| ENZYMES  | DUGNN | Accuracy | 67.3    |          | Unverified |
| IMDb-B   | DUGNN | Accuracy | 78.7    |          | Unverified |
| IMDb-M   | DUGNN | Accuracy | 56.1    |          | Unverified |
| PROTEINS | DUGNN | Accuracy | 81.7    |          | Unverified |
| PTC      | DUGNN | Accuracy | 74.7    |          | Unverified |

Reproductions

No reproductions yet.