No Spurious Local Minima: on the Optimization Landscapes of Wide and Deep Neural Networks
2020-09-28
Johannes Lederer
Abstract
Empirical studies suggest that wide neural networks are comparatively easy to optimize, but mathematical support for this observation is scarce. In this paper, we analyze the optimization landscapes of deep learning with wide networks. In particular, we prove that constrained and unconstrained empirical-risk minimization over such networks has no spurious local minima. Hence, our theory substantiates the common belief that increasing network widths not only improves the expressiveness of deep-learning pipelines but also facilitates their optimization.
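As a hedged illustration of the empirical observation the abstract cites (not of the paper's proofs), the sketch below runs plain gradient descent on the empirical risk of a wide one-hidden-layer ReLU network. The width, step size, and synthetic data are arbitrary choices for demonstration; in this heavily overparameterized regime the loss typically decreases smoothly toward a global minimum rather than stalling at a spurious local one.

```python
import numpy as np

# Synthetic regression data; all sizes are illustrative assumptions.
rng = np.random.default_rng(0)
n, d, width = 64, 5, 512            # samples, input dim, hidden width (wide regime)
X = rng.normal(size=(n, d))
y = rng.normal(size=(n, 1))

# Standard 1/sqrt(fan-in) initialization for a one-hidden-layer ReLU network.
W1 = rng.normal(size=(d, width)) / np.sqrt(d)
W2 = rng.normal(size=(width, 1)) / np.sqrt(width)

def risk(W1, W2):
    """Unconstrained empirical risk: mean squared error of the network."""
    H = np.maximum(X @ W1, 0.0)     # hidden ReLU features
    return float(np.mean((H @ W2 - y) ** 2))

lr = 0.01
losses = [risk(W1, W2)]
for _ in range(200):
    H = np.maximum(X @ W1, 0.0)
    r = (H @ W2 - y) / n            # residual, scaled for the mean
    gW2 = 2.0 * H.T @ r                         # grad wrt output weights
    gW1 = 2.0 * X.T @ ((r @ W2.T) * (H > 0))    # grad wrt hidden weights
    W1 -= lr * gW1
    W2 -= lr * gW2
    losses.append(risk(W1, W2))

print(f"initial risk: {losses[0]:.4f}, final risk: {losses[-1]:.4f}")
```

With these settings the empirical risk drops monotonically for this run, consistent with the landscape being free of spurious local minima; this is a single illustrative trajectory, not evidence at the paper's level of generality.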