
Differential Private Stochastic Optimization with Heavy-tailed Data: Towards Optimal Rates

2024-08-19

Puning Zhao, Jiafei Wu, Zhe Liu, Chong Wang, Rongfei Fan, Qingming Li

Abstract

We study convex optimization problems under differential privacy (DP). With heavy-tailed gradients, existing works achieve suboptimal rates. The main obstacle is that existing gradient estimators have suboptimal tail properties, resulting in a superfluous factor of $\sqrt{d}$ in the union bound. In this paper, we explore algorithms achieving optimal rates of DP optimization with heavy-tailed gradients. Our first method is a simple clipping approach. Under bounded $p$-th order moments of gradients, with $n$ samples, it achieves $\tilde{O}(\sqrt{d/n}+\sqrt{d}(\sqrt{d}/(n\epsilon))^{1-1/p})$ population risk with $\epsilon\leq 1/\sqrt{d}$. We then propose an iterative updating method, which is more complex but achieves this rate for all $\epsilon\leq 1$. The results significantly improve over existing methods. Such improvement relies on a careful treatment of the tail behavior of gradient estimators. Our results match the minimax lower bound of Kamath et al. (2022), indicating that the theoretical limit of stochastic convex optimization under DP is achievable.
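The "simple clipping approach" in the abstract follows the standard DP template: clip each per-sample gradient, average, and add Gaussian noise calibrated to the clipped sensitivity. The sketch below is a minimal illustration of that template, not the paper's algorithm: the clipping threshold `clip_c`, step size, and per-step budget `(eps, delta)` are placeholder choices, the heavy-tailed toy data is invented for the example, and the paper's threshold schedule and privacy accounting across iterations are omitted.

```python
import numpy as np

def clipped_dp_gradient(per_sample_grads, clip_c, eps, delta, rng):
    """One private gradient estimate: clip each per-sample gradient to
    L2 norm clip_c, average, and add Gaussian noise.  Replacing one
    sample moves the clipped mean by at most 2*clip_c/n in L2 norm, so
    this sigma gives an (eps, delta)-DP release for a single step
    (composition across steps is not accounted for here)."""
    n, d = per_sample_grads.shape
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    clipped = per_sample_grads * np.minimum(1.0, clip_c / np.maximum(norms, 1e-12))
    sigma = (2.0 * clip_c / n) * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return clipped.mean(axis=0) + rng.normal(0.0, sigma, size=d)

# Toy usage: least-squares regression with Student-t residuals, so the
# per-sample gradients have bounded low-order moments but heavy tails.
rng = np.random.default_rng(0)
n, d = 2000, 5
X = rng.normal(size=(n, d))
theta_star = np.ones(d)
y = X @ theta_star + rng.standard_t(df=2.5, size=n)  # heavy-tailed noise

theta = np.zeros(d)
for _ in range(200):
    residuals = X @ theta - y
    per_sample_grads = residuals[:, None] * X  # gradient of 0.5*(x.theta - y)^2
    g = clipped_dp_gradient(per_sample_grads, clip_c=5.0,
                            eps=0.1, delta=1e-5, rng=rng)
    theta -= 0.1 * g
```

Clipping is what controls the tail behavior the abstract refers to: with heavy-tailed gradients the choice of `clip_c` trades bias against sensitivity, and the paper's rates come from analyzing that trade-off more carefully than a fixed threshold like the one used above.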
