SOTAVerified

From Cluster Assumption to Graph Convolution: Graph-based Semi-Supervised Learning Revisited

2023-09-24 · Code Available

Zheng Wang, Hongming Ding, Li Pan, Jianhua Li, Zhiguo Gong, Philip S. Yu


Abstract

Graph-based semi-supervised learning (GSSL) has long been an active research topic. Traditional methods are generally shallow learners built on the cluster assumption. Recently, graph convolutional networks (GCNs) have become the predominant techniques owing to their promising performance. In this paper, we theoretically discuss the relationship between these two families of methods within a unified optimization framework. One of the most intriguing findings is that, unlike traditional methods, typical GCNs may not jointly consider the graph structure and label information at each layer. Motivated by this, we further propose three simple but powerful graph convolution methods. The first is a supervised method, OGC, which guides the graph convolution process with labels. The other two are unsupervised methods: GGC and its multi-scale version, GGCM, both of which aim to preserve the graph structure information during the convolution process. Finally, we conduct extensive experiments to show the effectiveness of our methods.
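The graph convolution process the abstract refers to is, at its core, repeated feature propagation over a normalized adjacency matrix. The following is a minimal sketch of that shared building block, not the authors' OGC/GGC algorithms themselves: the symmetric normalization `Â = D̃^{-1/2}(A + I)D̃^{-1/2}` and the update `X ← ÂX` are the standard GCN smoothing step; the function and variable names are illustrative.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2}, the standard GCN propagation matrix."""
    A_loop = A + np.eye(A.shape[0])
    d = A_loop.sum(axis=1)                # degree (with self-loops)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    return (A_loop * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def smooth(X, A, k=2):
    """Apply k rounds of parameter-free graph convolution X <- A_hat X,
    which progressively smooths features over the graph structure."""
    S = normalized_adjacency(A)
    for _ in range(k):
        X = S @ X
    return X
```

For example, on a two-node graph with a single edge, one smoothing round averages the two nodes' one-hot features into identical rows, illustrating how convolution pulls connected nodes together (the over-smoothing that OGC's label guidance and GGC's structure-preservation are designed to counteract, per the abstract).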


Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| CiteSeer (Public Split: fixed 20 nodes per class) | OGC | Accuracy | 77.5 | | Unverified |
| CiteSeer (Public Split: fixed 20 nodes per class) | GGCM | Accuracy | 74.2 | | Unverified |
| Cora (Public Split: fixed 20 nodes per class) | OGC | Accuracy | 86.9 | | Unverified |
| Cora (Public Split: fixed 20 nodes per class) | GGCM | Accuracy | 83.6 | | Unverified |
| PubMed (Public Split: fixed 20 nodes per class) | OGC | Accuracy | 83.4 | | Unverified |
| PubMed (Public Split: fixed 20 nodes per class) | GGCM | Accuracy | 80.8 | | Unverified |
