SOTAVerified

Triplet Interaction Improves Graph Transformers: Accurate Molecular Graph Learning with Triplet Graph Transformers

2024-02-07 · Code Available

Md Shamim Hussain, Mohammed J. Zaki, Dharmashankar Subramanian


Abstract

Graph transformers typically lack third-order interactions, limiting their geometric understanding, which is crucial for tasks like molecular geometry prediction. We propose the Triplet Graph Transformer (TGT), which enables direct communication between pairs within a 3-tuple of nodes via novel triplet attention and aggregation mechanisms. TGT is applied to molecular property prediction by first predicting interatomic distances from 2D graphs and then using these distances for downstream tasks. A novel three-stage training procedure and stochastic inference further improve training efficiency and model performance. Our model achieves new state-of-the-art (SOTA) results on the open challenge benchmarks PCQM4Mv2 and OC20 IS2RE. We also obtain SOTA results on the QM9, MOLPCBA, and LIT-PCBA molecular property prediction benchmarks via transfer learning. Finally, we demonstrate the generality of TGT with SOTA results on the traveling salesman problem (TSP).
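To make the idea of third-order interactions concrete, here is a minimal NumPy sketch of one plausible form of triplet attention over pair (edge) embeddings: each pair (i, j) attends over every third node k, with scores that couple the two edges (i, k) and (k, j). All names, shapes, and the exact scoring function are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def triplet_attention(E, Wq, Wk, Wv):
    """Hypothetical triplet attention over pair embeddings.

    E:  (n, n, d) array of pair embeddings e_ij.
    Wq, Wk, Wv: (d, d) projection matrices (assumed names).

    For each pair (i, j), the score over a third node k couples the
    edges (i, k) and (k, j), giving a third-order interaction; values
    are then aggregated from the edges (k, j).
    """
    n, _, d = E.shape
    Q = E @ Wq  # queries from pair (i, j)
    K = E @ Wk  # keys from pairs
    V = E @ Wv  # values from pairs
    out = np.zeros_like(E)
    for i in range(n):
        for j in range(n):
            # s_k = <q_ij, k_ik> + <q_ij, k_kj>, for every third node k
            s = (K[i, :] @ Q[i, j]) + (K[:, j] @ Q[i, j])
            a = softmax(s / np.sqrt(d))
            out[i, j] = a @ V[:, j]  # aggregate values from edges (k, j)
    return out
```

A practical implementation would vectorize the double loop and add multi-head projections; the O(n^3) triplet enumeration is the inherent cost of third-order interactions.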

Benchmark Results

Dataset             Model            Metric   Claimed   Verified   Status
LIT-PCBA (ALDH1)    EGT+TGT-At-DP    AUC      0.81      —          Unverified
LIT-PCBA (KAT2A)    EGT+TGT-At-DP    AUC      0.75      —          Unverified
LIT-PCBA (MAPK1)    EGT+TGT-At-DP    AUC      0.74      —          Unverified

Reproductions