
DenseGNN: universal and scalable deeper graph neural networks for high-performance property prediction in crystals and molecules

2025-01-05

Hongwei Du, Jiamin Wang, Jian Hui, Lanting Zhang, Hong Wang

Abstract

Generative models produce vast numbers of hypothetical materials, necessitating fast, accurate models for property prediction. Graph Neural Networks (GNNs) excel in this domain but face challenges such as high training costs, poor domain adaptation, and over-smoothing. We introduce DenseGNN, which employs a Dense Connectivity Network (DCN), Hierarchical Node-Edge-Graph Residual Networks (HRN), and Local Structure Order Parameters Embedding (LOPE) to address these challenges. DenseGNN achieves state-of-the-art performance on datasets such as JARVIS-DFT, Materials Project, and QM9, and improves the performance of models like GIN, SchNet, and HamNet on materials datasets. By optimizing atomic embeddings and reducing computational costs, DenseGNN enables deeper architectures and surpasses other GNNs in distinguishing crystal structures, approaching the accuracy of X-ray diffraction methods. This advances materials discovery and design.
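The dense-connectivity idea behind DCN can be illustrated with a minimal sketch: each layer receives the concatenation of all previous layer outputs, which preserves early-layer features and helps mitigate over-smoothing in deep GNN stacks. The code below is a hypothetical NumPy illustration of that pattern, not the authors' implementation; all function and variable names are assumptions.

```python
import numpy as np

def dense_gnn_forward(h0, adj, weights):
    """Illustrative densely connected GNN forward pass.

    h0:      (n, d) initial node features
    adj:     (n, n) adjacency matrix used for neighbor aggregation
    weights: list of weight matrices; layer k maps the concatenation of
             all previous outputs to that layer's new features
    """
    features = [h0]
    for W in weights:
        x = np.concatenate(features, axis=-1)   # dense skip connections
        msg = adj @ x                           # simple neighbor aggregation
        features.append(np.maximum(msg @ W, 0))  # ReLU layer update
    # final representation also concatenates every layer's output
    return np.concatenate(features, axis=-1)
```

Because every layer's input grows with depth, the weight shapes must match the accumulated feature width (e.g. with `d = 2` input features and two layers of width 4, the second weight matrix takes 2 + 4 = 6 input columns, and the output has 2 + 4 + 4 = 10 columns per node).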
