
ConCoDE: Hard-constrained Differentiable Co-Exploration Method for Neural Architectures and Hardware Accelerators

2021-09-29

Deokki Hong, Kanghyun Choi, Hey Yoon Lee, Joonsang Yu, Youngsok Kim, Noseong Park, Jinho Lee


Abstract

While DNNs achieve superhuman performance in a number of areas, this is often accompanied by skyrocketing computational costs. Co-exploration of an optimal neural architecture together with its hardware accelerator is an approach of rising interest that addresses this computational cost problem, especially in low-profile systems (e.g., embedded and mobile devices). The difficulty of searching the large co-exploration space is often addressed by adopting the idea of differentiable neural architecture search. Despite its superior search efficiency, differentiable co-exploration faces a critical challenge: it cannot systematically satisfy hard constraints such as a frame rate or power budget. To handle the hard-constraint problem of differentiable co-exploration, we propose ConCoDE, which searches for hard-constrained solutions without compromising the global design objectives. By manipulating the gradients in the interest of the given hard constraint, high-quality solutions satisfying the constraint can be obtained. Experimental results show that ConCoDE is able to meet the constraints even under tight conditions. We also show that the solutions found by ConCoDE are of high quality compared to those searched without any constraint.
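The abstract's core idea of steering a differentiable search by manipulating gradients can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it is a toy one-dimensional example where a penalty gradient (a hypothetical hinge-style term, with a budget of 2.0 standing in for, e.g., a latency constraint) is injected into the update only while the hard constraint is violated, pulling the solution back into the feasible region while still pursuing the global objective:

```python
# Toy sketch (assumptions, not the paper's method): gradient manipulation
# for a hard constraint. Objective: minimize f(x) = (x - 3)^2; hard
# constraint: x <= budget. A penalty gradient is added only when violated.

def objective_grad(x):
    # gradient of the global design objective f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

def constraint_violation(x, budget=2.0):
    # amount by which the hard constraint x <= budget is violated
    return max(0.0, x - budget)

def constrained_descent(x0=0.0, lr=0.001, steps=2000, penalty=100.0):
    x = x0
    for _ in range(steps):
        g = objective_grad(x)
        v = constraint_violation(x)
        if v > 0.0:
            # manipulate the gradient: add the penalty term's gradient
            # to steer the update toward the feasible region
            g += penalty * 2.0 * v
        x -= lr * g
    return x

x_star = constrained_descent()
```

With these (arbitrary) settings, the unconstrained optimum at x = 3 is infeasible, and the manipulated updates settle just inside the budget near x ≈ 2. A real co-exploration would apply the same idea to architecture and accelerator parameters jointly, with the constraint (frame rate, power) evaluated by a differentiable model.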
