SOTAVerified

Graph Representation Learning

The goal of Graph Representation Learning is to construct a set of features ('embeddings') representing the structure of the graph and the data on it. We can distinguish among node-wise embeddings (representing each node of the graph), edge-wise embeddings (representing each edge in the graph), and graph-wise embeddings (representing the graph as a whole).

Source: SIGN: Scalable Inception Graph Neural Networks
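The three embedding levels above can be sketched with a single mean-aggregation (message-passing) step on a toy graph. This is an illustrative sketch only, not the SIGN architecture: the graph, features, and aggregation scheme are made-up assumptions.

```python
import numpy as np

# Toy undirected graph: 4 nodes, 4 edges (hypothetical example data).
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
n = 4

# Adjacency matrix with self-loops, row-normalized so that
# multiplying by it performs one mean-aggregation step.
A = np.eye(n)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A_hat = A / A.sum(axis=1, keepdims=True)

# Initial node features: here simply the node degree (self-loop excluded).
X = A.sum(axis=1, keepdims=True) - 1.0

# Node-wise embeddings: each node averages its own and its neighbors' features.
node_emb = A_hat @ X                      # shape (4, 1)

# Edge-wise embeddings: concatenate the two endpoint embeddings.
edge_emb = np.array(
    [np.concatenate([node_emb[i], node_emb[j]]) for i, j in edges]
)                                         # shape (4, 2)

# Graph-wise embedding: mean-pool over all node embeddings.
graph_emb = node_emb.mean(axis=0)         # shape (1,)
```

Real models replace the degree features with learned inputs and stack several such aggregation steps, but the node/edge/graph distinction is exactly the one shown by the three output shapes.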

Papers

Showing 291–300 of 982 papers

| Title | Status | Hype |
|---|---|---|
| A Deep Probabilistic Framework for Continuous Time Dynamic Graph Generation | Code | 0 |
| GNN-Transformer Cooperative Architecture for Trustworthy Graph Contrastive Learning | Code | 0 |
| Scam Detection for Ethereum Smart Contracts: Leveraging Graph Representation Learning for Secure Blockchain | — | 0 |
| A Comparative Study on Dynamic Graph Embedding based on Mamba and Transformers | — | 0 |
| Multi-Class and Multi-Task Strategies for Neural Directed Link Prediction | Code | 0 |
| RingFormer: A Ring-Enhanced Graph Transformer for Organic Solar Cell Property Prediction | Code | 0 |
| Bootstrapping Heterogeneous Graph Representation Learning via Large Language Models: A Generalized Approach | — | 0 |
| Mixture of Experts Meets Decoupled Message Passing: Towards General and Adaptive Node Classification | Code | 0 |
| Why Does Dropping Edges Usually Outperform Adding Edges in Graph Contrastive Learning? | Code | 0 |
| Fine-grained graph representation learning for heterogeneous mobile networks with attentive fusion and contrastive learning | — | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Pi-net-linear | Error (mm) | 0.47 | — | Unverified |