
Automatic Generation of Neural Architecture Search Spaces

2021-11-21 · AAAI Workshop CLeaR 2022 · Code Available

David Calhas, Vasco M. Manquinho, Ines Lynce


Abstract

Neural Architecture Search (NAS) is receiving growing attention as the need to remove human bias from neural network models rises. There is extensive research on beating state-of-the-art NAS algorithms, but these advances do not directly address the search space the algorithms explore. Here, we propose a framework that encodes the structure of a convolutional neural network, respecting the arithmetical relation of the kernel and stride sizes to the input and output shapes. The framework consists of a formula with constraints that, given the structure of the problem (input and output shapes), can produce the specification of a neural architecture through a solver. We show that this methodology can assemble networks of arbitrary size and structure, yielding unique and uniform search spaces. To compare the resulting architectures, we propose a metric that computes dissimilarity in terms of architectural structure. We empirically show that generating dissimilar architectures implies dissimilarity in performance; accordingly, similar architectures are similar in performance.
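The arithmetical relation the abstract refers to is the standard convolution shape constraint, out = floor((in − k + 2p) / s) + 1 for kernel size k, stride s, and padding p. As a minimal sketch of the idea (a brute-force enumeration, not the authors' constraint solver; all names and bounds here are illustrative assumptions), one can enumerate the layer parameters consistent with a given input/output shape pair:

```python
# Illustrative sketch, NOT the paper's solver: enumerate (kernel, stride,
# padding) triples satisfying the 1-D convolution shape constraint
#   out = floor((in - k + 2p) / s) + 1.
def valid_conv_params(in_size, out_size, max_kernel=7, max_stride=4, max_pad=3):
    """Return all (k, s, p) triples that map in_size to out_size."""
    solutions = []
    for k in range(1, max_kernel + 1):          # kernel size
        for s in range(1, max_stride + 1):      # stride
            for p in range(0, max_pad + 1):     # zero padding
                span = in_size - k + 2 * p      # effective sliding range
                if span >= 0 and span // s + 1 == out_size:
                    solutions.append((k, s, p))
    return solutions

# Example: which convolutional layers map a 32-wide input to a 16-wide output?
for k, s, p in valid_conv_params(32, 16):
    print(f"kernel={k} stride={s} padding={p}")
```

A dedicated solver (e.g. SMT or constraint programming) replaces this enumeration and scales it to chains of layers, where the output shape of each layer constrains the input shape of the next.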
