Improving Graph Convolutional Networks with Transformer Layer in social-based items recommendation
2024-01-12
Thi Linh Hoang, Tuan Dung Pham, Viet Cuong Ta
- Code: github.com/linhthi/ts (official, PyTorch)
Abstract
In this work, we propose an approach for improving the GCN for rating prediction in social networks. Our model extends the standard GCN with several transformer layers. The main focus of the paper is the encoder architecture used for node embedding in the network. Starting from the embeddings produced by the graph convolution layers, the attention mechanism rearranges the feature space to obtain a more efficient embedding for the downstream task. Experiments show that the proposed architecture outperforms the GCN on the traditional link prediction task.
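The abstract describes stacking transformer layers on top of GCN node embeddings so that self-attention can re-weight the feature space. The paper's exact architecture is in the linked repository; the sketch below is only a minimal, hypothetical illustration of that general idea in PyTorch (all class names, dimensions, and the single-layer GCN are assumptions, not the authors' implementation):

```python
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """A single graph-convolution layer: H' = ReLU(A_hat @ H @ W),
    where a_hat is a normalized adjacency matrix."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        # a_hat: (N, N) normalized adjacency; h: (N, in_dim) node features
        return torch.relu(self.linear(a_hat @ h))


class GCNTransformer(nn.Module):
    """Hypothetical sketch: a GCN encoder followed by transformer
    encoder layers whose self-attention operates over the node
    embeddings, letting attention rearrange the feature space."""

    def __init__(self, in_dim, hid_dim, n_heads=4, n_layers=2):
        super().__init__()
        self.gcn = GCNLayer(in_dim, hid_dim)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hid_dim, nhead=n_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=n_layers)

    def forward(self, a_hat, x):
        h = self.gcn(a_hat, x)                # graph-based node embeddings
        # Treat the N nodes as one sequence of length N (batch of 1)
        z = self.transformer(h.unsqueeze(0)).squeeze(0)
        return z                              # refined (N, hid_dim) embeddings


# Toy usage: 5 nodes with 8-dimensional input features
a_hat = torch.eye(5)                          # placeholder adjacency
x = torch.randn(5, 8)
model = GCNTransformer(in_dim=8, hid_dim=16)
z = model(a_hat, x)                           # shape: (5, 16)
```

For the rating-prediction task, the refined embeddings of a user node and an item node would then be scored (e.g. by a dot product or a small MLP) to predict the edge label.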