
The BUTTER Zone: An Empirical Study of Training Dynamics in Fully Connected Neural Networks

2022-07-25

Charles Edison Tripp, Jordan Perr-Sauer, Lucas Hayne, Monte Lunacek, Jamil Gafur


Abstract

We present an empirical dataset surveying the deep learning phenomenon on fully connected feed-forward multilayer perceptron neural networks. The dataset, now freely available online, records the per-epoch training and generalization performance of 483 thousand distinct hyperparameter configurations spanning architecture, task, depth, network size (number of parameters), learning rate, batch size, and regularization penalty. Repeating each experiment an average of 24 times produced 11 million training runs and 40 billion recorded epochs. Accumulating this 1.7 TB dataset consumed 11 thousand CPU core-years, 72.3 GPU-years, and 163 node-years. Surveying the dataset, we observe durable patterns that persist across tasks and topologies. We aim to spark scientific study of machine learning techniques as a catalyst for the theoretical discoveries needed to progress the field beyond energy-intensive, heuristic practice.
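To give a rough sense of the scale of sweep the abstract describes, here is a minimal sketch of enumerating repeated hyperparameter combinations. The grid values and key names below are invented for illustration; they are not the paper's actual experimental settings.

```python
from itertools import product

# Illustrative grid only -- NOT the BUTTER dataset's actual values.
grid = {
    "depth": [2, 3, 4, 5],
    "size": [2**k for k in range(5, 15)],       # number of parameters
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [32, 128, 512],
    "l2_penalty": [0.0, 1e-5, 1e-3],
}

def enumerate_runs(grid, repetitions=24):
    """Yield one dict per training run: every combination in the grid,
    repeated `repetitions` times (the paper repeats each experiment an
    average of 24 times)."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        config = dict(zip(keys, values))
        for rep in range(repetitions):
            yield {**config, "repetition": rep}

runs = list(enumerate_runs(grid))
# This toy grid yields 4 * 10 * 3 * 3 * 3 * 24 = 25,920 runs; the
# actual dataset's 483 thousand configurations arise the same way,
# from a much larger grid.
```

Multiplying out a grid like this is how a few axes of hyperparameters compound into hundreds of thousands of distinct configurations, and why the repeated runs total in the millions.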
