
JavaBERT: Training a transformer-based model for the Java programming language

2021-10-20 · Code Available

Nelson Tavares de Sousa, Wilhelm Hasselbring


Abstract

Code quality is, and will remain, a crucial factor in developing new software, requiring appropriate tools to ensure functional and reliable code. Machine learning techniques are still rarely used in software engineering tools, missing out on the potential benefits of their application. Natural language processing has shown the potential to process text data across a variety of tasks. We argue that such models can offer similar benefits for processing software code. In this paper, we investigate how models used for natural language processing can be trained on software code. We introduce a data retrieval pipeline for software code and train a model on Java software code. The resulting model, JavaBERT, achieves high accuracy on the masked language modeling task, demonstrating its potential for software engineering tools.
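The masked language modeling task mentioned in the abstract hides individual tokens in an input sequence and asks the model to predict them. Below is a minimal sketch of querying such a model on Java code, assuming the trained weights are published on the Hugging Face Hub under an ID like "CAUKiel/JavaBERT" and that the standard BERT "[MASK]" token is used; both are assumptions, so consult the paper's code release for the actual artifacts.

from transformers import pipeline

# Load a fill-mask pipeline with the (assumed) JavaBERT checkpoint.
fill_mask = pipeline("fill-mask", model="CAUKiel/JavaBERT")

# Mask a single token in a Java snippet; the model should rank
# plausible completions such as "static" highest.
snippet = "public [MASK] void main(String[] args) { }"
for prediction in fill_mask(snippet):
    print(prediction["token_str"], prediction["score"])

Each prediction is a dictionary containing the candidate token and its score, so the loop prints the model's ranked completions for the masked position.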
