SOTAVerified

Graph Representation Learning

The goal of Graph Representation Learning is to construct a set of features ('embeddings') representing the structure of the graph and the data on it. We can distinguish among node-wise embeddings, representing each node of the graph; edge-wise embeddings, representing each edge in the graph; and graph-wise embeddings, representing the graph as a whole.

Source: SIGN: Scalable Inception Graph Neural Networks
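As a minimal illustration of the three embedding levels above (a sketch, not taken from any listed paper), the following NumPy snippet builds a classic spectral node embedding from the graph Laplacian, then derives edge-wise and graph-wise embeddings from it. The toy 4-cycle graph and the pooling choices are assumptions for the example.

```python
import numpy as np

# Toy undirected graph: a 4-node cycle (illustrative, not from the source).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# Node-wise embeddings: the two smallest non-trivial eigenvectors of the
# graph Laplacian L = D - A (a classic spectral embedding).
D = np.diag(A.sum(axis=1))
L = D - A
eigvals, eigvecs = np.linalg.eigh(L)
node_emb = eigvecs[:, 1:3]          # shape (num_nodes, 2)

# Edge-wise embedding: here, concatenate the two endpoint embeddings.
edge_01 = np.concatenate([node_emb[0], node_emb[1]])   # shape (4,)

# Graph-wise embedding: here, mean-pool the node embeddings.
graph_emb = node_emb.mean(axis=0)   # shape (2,)
```

Other choices are common at each level (e.g. learned GNN layers instead of spectral vectors, sum- or attention-pooling instead of the mean); the point is only that edge- and graph-level representations are typically derived from node-level ones.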

Papers

Showing 301–310 of 982 papers

Title | Status | Hype
SignGT: Signed Attention-based Graph Transformer for Graph Representation Learning | — | 0
Self-Pro: A Self-Prompt and Tuning Framework for Graph Neural Networks | Code | 0
MAGIC: Detecting Advanced Persistent Threats via Masked Graph Representation Learning | Code | 1
SGA: A Graph Augmentation Method for Signed Graph Neural Networks | — | 0
Topology-guided Hypergraph Transformer Network: Unveiling Structural Insights for Improved Representation | — | 0
Does Graph Distillation See Like Vision Dataset Counterpart? | Code | 1
An Edge-Aware Graph Autoencoder Trained on Scale-Imbalanced Data for Traveling Salesman Problems | — | 0
Certifiably Robust Graph Contrastive Learning | Code | 1
Audio Event-Relational Graph Representation Learning for Acoustic Scene Classification | Code | 1
Transformers are efficient hierarchical chemical graph learners | Code | 0
Page 31 of 99

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | Pi-net-linear | Error (mm) | 0.47 | — | Unverified