SOTAVerified

Unsupervised Inductive Graph-Level Representation Learning via Graph-Graph Proximity

2019-04-01 · Code Available

Yunsheng Bai, Hao Ding, Yang Qiao, Agustin Marinovic, Ken Gu, Ting Chen, Yizhou Sun, Wei Wang


Abstract

We introduce a novel approach to graph-level representation learning: embed an entire graph into a vector space where the embeddings of two graphs preserve their graph-graph proximity. Our approach, UGraphEmb, is a general framework that provides a novel means of performing graph-level embedding in a completely unsupervised and inductive manner. The learned neural network can be viewed as a function that receives any graph as input, whether seen or unseen during training, and transforms it into an embedding. We also propose a novel graph-level embedding generation mechanism called Multi-Scale Node Attention (MSNA). Experiments on five real graph datasets show that UGraphEmb achieves competitive accuracy in graph classification, similarity ranking, and graph visualization.
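The abstract's Multi-Scale Node Attention readout can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, the sigmoid gating against a ReLU-transformed mean-node context, and the per-scale weight matrices `Ws` are assumptions for the sake of a runnable example; the real model learns these parameters end-to-end inside a GNN.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_readout(H, W):
    """Attention-weighted sum of node embeddings at one scale (sketch).

    H: (N, d) node embeddings from one GNN layer; W: (d, d) learned weights.
    """
    # Global context vector: ReLU-transformed mean of the node embeddings.
    c = np.maximum(W @ H.mean(axis=0), 0.0)   # (d,)
    # Per-node attention score against the context, squashed to (0, 1).
    a = sigmoid(H @ c)                        # (N,)
    # Weighted sum over nodes -> one graph-level vector for this scale.
    return a @ H                              # (d,)

def msna_readout(scales, Ws):
    """Concatenate attention readouts across GNN layers ("scales")."""
    return np.concatenate(
        [attention_readout(H, W) for H, W in zip(scales, Ws)]
    )
```

Because the readout aggregates over nodes and then concatenates a fixed number of scales, the output dimension is independent of graph size, which is what lets one trained network embed unseen graphs of any size.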

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| IMDb-M | UGraphEmb-F | Accuracy | 50.97 | | Unverified |
| IMDb-M | UGraphEmb | Accuracy | 50.06 | | Unverified |
| NCI109 | UGraphEmb-F | Accuracy | 74.48 | | Unverified |
| NCI109 | UGraphEmb | Accuracy | 69.17 | | Unverified |
| PTC | UGraphEmb | Accuracy | 72.54 | | Unverified |
| PTC | UGraphEmb-F | Accuracy | 73.56 | | Unverified |
| REDDIT-MULTI-12K | UGraphEmb-F | Accuracy | 41.84 | | Unverified |
| REDDIT-MULTI-12K | UGraphEmb | Accuracy | 39.97 | | Unverified |
| Web | UGraphEmb-F | Accuracy | 45.03 | | Unverified |
