
CryptoGRU: Low Latency Privacy-Preserving Text Analysis With GRU

2020-10-22 · EMNLP 2021

Bo Feng, Qian Lou, Lei Jiang, Geoffrey C. Fox


Abstract

Billions of text analysis requests containing private emails, personal text messages, and sensitive online reviews are processed by recurrent neural networks (RNNs) deployed on public clouds every day. Although prior secure networks combine homomorphic encryption (HE) and garbled circuits (GC) to preserve users' privacy, naively adopting the HE and GC hybrid technique to implement RNNs suffers from long inference latency due to slow activation functions. In this paper, we present an HE and GC hybrid gated recurrent unit (GRU) network, CryptoGRU, for low-latency secure inferences. CryptoGRU replaces the computationally expensive GC-based tanh with a fast GC-based ReLU, and then quantizes sigmoid and ReLU with a smaller bit length to accelerate activations in a GRU. We evaluate CryptoGRU with multiple GRU models trained on 4 public datasets. Experimental results show CryptoGRU achieves top-notch accuracy and improves the secure inference latency by up to 138× over one of the state-of-the-art secure networks on the Penn Treebank dataset.
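The activation substitution the abstract describes can be sketched in plaintext: a single GRU step where the candidate state uses ReLU instead of tanh, with gate outputs passed through a low-bit uniform quantizer. This is only an illustrative sketch of the idea, not the secure HE/GC protocol itself; the `quantize` function, its bit width, and its clipping range are hypothetical choices, not parameters from the paper.

```python
import numpy as np

def quantize(x, bits=6, clip=4.0):
    # Hypothetical uniform quantizer: clip to [-clip, clip], then round
    # to the grid representable with the given (smaller) bit length.
    x = np.clip(x, -clip, clip)
    scale = (2 ** (bits - 1) - 1) / clip
    return np.round(x * scale) / scale

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell_relu(x, h, Wz, Uz, Wr, Ur, Wh, Uh, bits=6):
    # One plaintext GRU step with tanh replaced by ReLU in the candidate
    # state (the substitution CryptoGRU makes to cut GC cost), and with
    # sigmoid/ReLU outputs quantized to a smaller bit length.
    z = quantize(sigmoid(Wz @ x + Uz @ h), bits)            # update gate
    r = quantize(sigmoid(Wr @ x + Ur @ h), bits)            # reset gate
    h_tilde = quantize(relu(Wh @ x + Uh @ (r * h)), bits)   # candidate (ReLU, not tanh)
    return (1.0 - z) * h + z * h_tilde
```

In a GC evaluation, ReLU needs only a comparison and a multiplexer per wire, whereas tanh requires a much deeper circuit, which is why the swap reduces latency; the quantization further shrinks the bit width of each garbled activation.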
