
Attention-Free Keyword Spotting

2021-10-14 · Code Available

Mashrur M. Morshed, Ahmad Omar Ahsan


Abstract

To date, attention-based models have been used with great success in the keyword spotting problem domain. However, in light of recent advances in deep learning, the question arises whether self-attention is truly irreplaceable for recognizing speech keywords. We therefore explore the use of gated MLPs (previously shown to be alternatives to transformers in vision tasks) for the keyword spotting task. We provide a family of highly efficient MLP-based models for keyword spotting, each with fewer than 0.5 million parameters. We show that our approach achieves competitive performance on the Google Speech Commands V2-12 and V2-35 benchmarks with far fewer parameters than self-attention-based methods.
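The gated-MLP blocks the abstract refers to can be sketched roughly as follows. This is a minimal NumPy illustration of the spatial-gating idea from the gMLP line of work, not the paper's actual KW-MLP implementation; the function names, shapes, and near-identity initialization are illustrative assumptions.

```python
import numpy as np

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def gmlp_block(x, W1, b1, Ws, bs, W2, b2):
    """One gated-MLP block (illustrative shapes, layer norm omitted).

    x  : (n_tokens, d_model)   e.g. frames/patches of a log-mel spectrogram
    W1 : (d_model, d_ffn)      channel projection in
    Ws : (n_tokens, n_tokens)  spatial (token-mixing) projection
    W2 : (d_ffn // 2, d_model) channel projection out
    """
    u = gelu(x @ W1 + b1)             # (n_tokens, d_ffn)
    u1, u2 = np.split(u, 2, axis=-1)  # gate half and value half
    v = Ws @ u2 + bs                  # mix information across tokens (no attention)
    return x + (u1 * v) @ W2 + b2     # gated output plus residual connection

# Tiny example: 8 tokens, 16 channels, FFN width 32 (sizes are arbitrary)
rng = np.random.default_rng(0)
n, d, dff = 8, 16, 32
x  = rng.normal(size=(n, d))
W1 = rng.normal(size=(d, dff)) * 0.1
b1 = np.zeros(dff)
Ws = np.zeros((n, n))                 # near-zero spatial weights...
bs = np.ones((n, 1))                  # ...and unit bias: block starts near identity
W2 = rng.normal(size=(dff // 2, d)) * 0.1
b2 = np.zeros(d)
y  = gmlp_block(x, W1, b1, Ws, bs, W2, b2)
assert y.shape == (n, d)
```

With `Ws` at zero and `bs` at one, the gate multiplies `u1` by exactly 1, so the block behaves as a plain residual MLP at initialization; training then learns the token-mixing weights that take the place of self-attention.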

Tasks

Keyword Spotting

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Google Speech Commands | KW-MLP | Google Speech Commands V2-35 (accuracy) | 97.56 | | Unverified |
