
Semialgebraic Optimization for Lipschitz Constants of ReLU Networks

2020-02-10 · NeurIPS 2020 · Code available

Tong Chen, Jean-Bernard Lasserre, Victor Magron, Edouard Pauwels


Abstract

The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multilayer deep neural network. The novelty is to combine a polynomial lifting for the derivative of ReLU functions with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off with respect to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
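For intuition, here is a minimal sketch of the kind of lifting the abstract refers to, written for a one-hidden-layer network $f(x) = c^\top \mathrm{ReLU}(Wx + b)$; the constraint form below is a standard semialgebraic encoding of the ReLU derivative, given for illustration only, and may differ from the paper's exact (sparse) formulation. Wherever the gradient exists,

\[
\nabla f(x) = W^\top \operatorname{diag}(s)\, c, \qquad s_i = \mathbf{1}\{\, w_i^\top x + b_i > 0 \,\},
\]

and the discontinuous indicator $s_i$ can be replaced by a lifted variable satisfying the polynomial constraints

\[
s_i^2 = s_i, \qquad s_i\,(w_i^\top x + b_i) \ge 0, \qquad (s_i - 1)\,(w_i^\top x + b_i) \ge 0,
\]

which force $s_i = 1$ when the preactivation is positive and $s_i = 0$ when it is negative. Maximizing the polynomial $\|W^\top \operatorname{diag}(s)\, c\|_2^2$ over these constraints (plus a ball constraint on $x$ in the local case) is then a polynomial optimization problem, and a Putinar-type positivity certificate turns each level of the resulting moment-SOS hierarchy into a semidefinite program whose value upper-bounds the squared Lipschitz constant.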
