
SelectFormer: Private and Practical Data Selection for Transformers

2023-10-03

Xu Ouyang, Felix Xiaozhu Lin, Yangfeng Ji


Abstract

Critical to a free data market is private data selection, i.e., the model owner selects and then appraises training data from the data owner before both parties commit to a transaction. To keep both the data and the model private, this process must evaluate the target model to be trained over Multi-Party Computation (MPC). While prior work suggests that evaluating Transformer-based models over MPC is prohibitively expensive, this paper makes it practical for the purpose of data selection. Our contributions are threefold: (1) a new pipeline for private data selection over MPC; (2) emulating high-dimensional nonlinear operators with low-dimensional MLPs, which are trained on a small sample of the data of interest; (3) scheduling MPC in a parallel, multiphase fashion. We evaluate our method on diverse Transformer models and NLP/CV benchmarks. Compared to directly evaluating the target model over MPC, our method reduces the delay from thousands of hours to tens of hours, while incurring only around 0.20% accuracy degradation when training with the selected data.
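The abstract's contribution (2) replaces MPC-expensive nonlinear operators with small MLPs fitted on a sample of the data. The paper does not give implementation details here, so the sketch below is only an illustrative assumption of the general idea: it fits a tiny one-hidden-layer ReLU MLP (matrix multiplies and comparisons, which are cheap under MPC) to approximate GELU (whose tanh/erf is expensive under MPC) on a sampled input range. All names and hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def gelu(x):
    # tanh approximation of GELU -- the "expensive" nonlinearity under MPC
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

# sample training inputs from the range of interest (hypothetical range)
x = rng.uniform(-4, 4, size=(2048, 1))
y = gelu(x)

# tiny 1 -> 16 -> 1 MLP with ReLU; ReLU is MPC-friendly (a comparison)
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(4000):
    h = np.maximum(x @ W1 + b1, 0.0)      # forward pass
    pred = h @ W2 + b2
    err = pred - y                        # MSE gradient (up to a constant)
    # backpropagation, full batch
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# the trained emulator uses only matmuls + ReLU, so it is cheap over MPC
xt = rng.uniform(-4, 4, size=(512, 1))
mae = np.abs((np.maximum(xt @ W1 + b1, 0.0) @ W2 + b2) - gelu(xt)).mean()
print(f"mean abs error on held-out inputs: {mae:.4f}")
```

The design intuition is that under MPC, linear algebra over secret shares is comparatively cheap, while transcendental functions require costly polynomial or iterative protocols; substituting a small learned piecewise-linear approximation trades a little accuracy for a large cost reduction.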
