Chemformer: a pre-trained transformer for computational chemistry

2022-01-31 · Machine Learning: Science and Technology (2022)

Ross Irwin, Spyridon Dimitriadis, Jiazhen He, Esben Jannik Bjerrum


Abstract

Transformer models coupled with the simplified molecular-input line-entry system (SMILES) have recently proven to be a powerful combination for solving challenges in cheminformatics. These models, however, are often developed for a single application and can be very resource-intensive to train. In this work we present Chemformer, a Transformer-based model that can be quickly applied to both sequence-to-sequence and discriminative cheminformatics tasks. Additionally, we show that self-supervised pre-training can improve performance and significantly speed up convergence on downstream tasks. On direct synthesis and retrosynthesis prediction benchmark datasets we report state-of-the-art top-1 accuracy. We also improve on existing approaches for a molecular optimisation task and show that Chemformer can be optimised on multiple discriminative tasks simultaneously. Models, datasets and code will be made available after publication.
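To make the abstract's approach concrete, the toy script below sketches the kind of BART-style denoising pre-training on SMILES it describes: corrupt a SMILES string, then train an encoder-decoder Transformer to reconstruct the original. This is a minimal sketch in plain PyTorch, not the authors' code; the character-level tokenisation, masking scheme, corpus, and model sizes are all illustrative assumptions, and positional encodings are omitted for brevity.

```python
# Sketch of denoising seq2seq pre-training on SMILES (toy, illustrative only).
import random
import torch
import torch.nn as nn

SMILES = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # toy corpus

# Character-level vocabulary with special tokens (a simplification;
# the paper uses its own SMILES tokeniser).
PAD, BOS, EOS, MASK = "<pad>", "<bos>", "<eos>", "<mask>"
chars = sorted({ch for s in SMILES for ch in s})
vocab = {tok: i for i, tok in enumerate([PAD, BOS, EOS, MASK] + chars)}

def encode(s):
    return [vocab[BOS]] + [vocab[ch] for ch in s] + [vocab[EOS]]

def corrupt(ids, p=0.15):
    # Randomly replace non-special tokens with <mask>
    # (a simplification of BART-style span masking).
    return [vocab[MASK] if t >= 4 and random.random() < p else t for t in ids]

class TinySeq2Seq(nn.Module):
    def __init__(self, n_vocab, d_model=64):
        super().__init__()
        self.emb = nn.Embedding(n_vocab, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            dim_feedforward=128, batch_first=True,
        )
        self.out = nn.Linear(d_model, n_vocab)

    def forward(self, src, tgt):
        # Causal mask so the decoder cannot attend to future tokens.
        mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.emb(src), self.emb(tgt), tgt_mask=mask)
        return self.out(h)

model = TinySeq2Seq(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=vocab[PAD])

for step in range(10):  # a few denoising steps on the toy corpus
    s = random.choice(SMILES)
    tgt = torch.tensor([encode(s)])           # original SMILES as target
    src = torch.tensor([corrupt(encode(s))])  # masked SMILES as input
    logits = model(src, tgt[:, :-1])          # teacher forcing
    loss = loss_fn(logits.reshape(-1, len(vocab)), tgt[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```

Under this framing, fine-tuning for the seq2seq tasks in the abstract would swap the (corrupted, original) pairs for task pairs such as (product, reactants) SMILES in retrosynthesis, while discriminative fine-tuning would predict a property from the encoder output instead of decoding a sequence.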
