TAdam: A Robust Stochastic Gradient Optimizer

2020-02-29

Wendyam Eric Lionel Ilboudo, Taisuke Kobayashi, Kenji Sugimoto

Abstract

Machine learning algorithms aim to find patterns from observations, which may include some noise, especially in the robotics domain. To perform well even with such noise, we expect them to be able to detect outliers and discard them when needed. We therefore propose a new stochastic gradient optimization method whose robustness is directly built into the algorithm, using the robust Student-t distribution as its core idea. Adam, the popular optimization method, is modified with our approach, and the resulting optimizer, called TAdam, is shown to effectively outperform Adam in terms of robustness against noise on diverse tasks, ranging from regression and classification to reinforcement learning problems. The implementation of our algorithm can be found at https://github.com/Mahoumaru/TAdam.git
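For intuition, here is a minimal NumPy sketch of the Student-t-weighted first-moment update at the core of TAdam. This is an illustrative reconstruction, not the authors' reference code (see the linked repository for that): the function name `tadam_step`, the `state` dictionary, the retained Adam-style bias corrections, and the default degrees of freedom ν = d are assumptions on my part, and the recursions for the weight w and the accumulator W follow my reading of the paper.

```python
import numpy as np

def tadam_step(param, grad, state, lr=1e-3, betas=(0.9, 0.999),
               eps=1e-8, nu=None):
    """One TAdam-style update (illustrative sketch, not the reference code).

    Versus Adam, the first moment m is a weighted incremental mean whose
    Student-t weight w shrinks for gradients that deviate strongly from m,
    so outlier gradients barely move the estimate.
    """
    d = grad.size
    nu = d if nu is None else nu  # degrees of freedom; assumed default = gradient dim
    t = state["t"] + 1
    m, v, W = state["m"], state["v"], state["W"]

    # Student-t weight: close to 1 for inlier gradients, small for outliers
    dev = np.sum((grad - m) ** 2 / (v + eps))
    w = (nu + d) / (nu + dev)

    # Robust first moment: weighted incremental mean instead of a plain EMA
    m = (W * m + w * grad) / (W + w)
    W = (2.0 * betas[0] - 1.0) / betas[0] * W + w  # keeps W/(W+w) near beta1 in steady state

    # Second moment and bias corrections kept as in Adam (assumption)
    v = betas[1] * v + (1.0 - betas[1]) * grad ** 2
    m_hat = m / (1.0 - betas[0] ** t)
    v_hat = v / (1.0 - betas[1] ** t)

    state.update(m=m, v=v, W=W, t=t)
    return param - lr * m_hat / (np.sqrt(v_hat) + eps)

# Usage sketch on a toy quadratic with occasional outlier gradients
rng = np.random.default_rng(0)
x = np.ones(4)
state = {"m": np.zeros(4), "v": np.zeros(4),
         "W": 0.9 / (1.0 - 0.9),  # W_0 = beta1 / (1 - beta1), per my reading of the paper
         "t": 0}
for i in range(100):
    g = 2.0 * x                              # gradient of ||x||^2
    if i % 10 == 0:
        g += rng.normal(scale=50.0, size=4)  # inject heavy-tailed noise
    x = tadam_step(x, g, state)
```

The design point is that w shrinks when a gradient deviates strongly from the running mean m, so a single corrupted sample barely moves the first moment; as ν grows, the Student-t tends to a Gaussian, w tends to a constant, and the update approaches Adam's exponential moving average.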
