Online normalizer calculation for softmax
2018-05-08
Maxim Milakov, Natalia Gimelshein
Code: github.com/NVIDIA/online-softmax (official)
Abstract
The Softmax function is ubiquitous in machine learning, and multiple previous works have suggested faster alternatives to it. In this paper we propose a way to compute the classical Softmax with fewer memory accesses and hypothesize that this reduction in memory accesses should improve Softmax performance on actual hardware. The benchmarks confirm this hypothesis: Softmax accelerates by up to 1.3x, and Softmax combined and fused with TopK accelerates by up to 5x.
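The reduction in memory accesses comes from computing the running maximum and the exponential sum in a single pass over the input, rescaling the partial sum whenever the maximum grows, instead of making separate passes for the max and for the sum. Below is a minimal Python sketch of this online normalizer update, assuming a plain sequential loop; the function name and structure are illustrative and not taken from the repository's CUDA implementation:

```python
import math

def online_softmax(x):
    # Single pass: maintain the running maximum m and the running
    # normalizer d = sum(exp(x_j - m)), rescaling d whenever m grows.
    m = float("-inf")
    d = 0.0
    for v in x:
        m_new = max(m, v)
        # Rescale the old partial sum to the new maximum, then add
        # the contribution of the current element.
        d = d * math.exp(m - m_new) + math.exp(v - m_new)
        m = m_new
    # A second pass is still needed to write the normalized outputs,
    # but the max and the sum were obtained in one pass over the input.
    return [math.exp(v - m) / d for v in x]

print(online_softmax([1.0, 2.0, 3.0]))
```

In a two-pass max-then-sum formulation each input element is read twice before normalization; the online recurrence reads it once, which is where the claimed savings in memory accesses come from.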