Power Consumption Variation over Activation Functions
2020-06-12
Leon Derczynski
- Code (official, PyTorch): github.com/leondz/inferencepower
Abstract
The power that machine learning models consume when making predictions can be affected by a model's architecture. This paper presents estimates of power consumption for a range of activation functions, a core factor in neural network architecture design. Substantial differences in hardware performance exist between activation functions, and this difference informs how power consumption in machine learning models can be reduced.
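To illustrate the kind of comparison the abstract describes, below is a minimal sketch that times several common activation functions on the same input. This is not the paper's method: the official code uses PyTorch and measures power consumption directly, whereas this sketch uses NumPy and wall-clock time as a rough proxy for per-activation compute cost. The function names and repeat counts are illustrative assumptions.

```python
import time
import numpy as np

# Illustrative activation functions (not taken from the paper's code)
def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def gelu(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def time_activation(fn, x, repeats=50):
    """Average wall-clock seconds per call; a crude proxy for energy cost."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(x)
    return (time.perf_counter() - start) / repeats

x = np.random.randn(1_000_000).astype(np.float32)
timings = {
    name: time_activation(fn, x)
    for name, fn in [("relu", relu), ("sigmoid", sigmoid),
                     ("tanh", tanh), ("gelu", gelu)]
}
for name, t in sorted(timings.items(), key=lambda kv: kv[1]):
    print(f"{name:8s} {t * 1e3:.3f} ms")
```

Even this crude timing loop typically shows a spread between simple piecewise-linear functions like ReLU and exponential-based ones like sigmoid or GELU; the paper's contribution is to quantify the analogous spread in measured power draw on real hardware.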