MBGDT: Robust Mini-Batch Gradient Descent
2022-06-14
Hanming Wang, Haozheng Luo, Yue Wang
- github.com/WHMHammer/robust-mini-batch-gradient-descent (official)
- github.com/WHMHammer/496-final-project (official)
Abstract
In high dimensions, most machine learning methods are fragile even in the presence of a small number of outliers. To address this, we introduce a new method built on a base learner, such as Bayesian regression or stochastic gradient descent, to mitigate this vulnerability. Because mini-batch gradient descent allows for more robust convergence than batch gradient descent, we develop our method on top of mini-batch gradient descent and call it Mini-Batch Gradient Descent with Trimming (MBGDT). Our method achieves state-of-the-art performance and greater robustness than several baselines when applied to our designed datasets.
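The core idea of trimming inside mini-batch gradient descent can be sketched as follows. This is a minimal illustration, not the paper's official implementation: it assumes a linear-regression base learner with squared loss, and the function name `mbgdt` and its hyperparameters (`lr`, `trim`, etc.) are illustrative choices, not taken from the paper.

```python
import numpy as np

def mbgdt(X, y, lr=0.01, epochs=200, batch_size=32, trim=0.2, seed=0):
    """Mini-batch gradient descent with per-batch trimming (sketch).

    In each mini-batch, the `trim` fraction of samples with the largest
    squared residuals is dropped before the gradient step, so gross
    outliers contribute nothing to the update.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)  # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            Xb, yb = X[b], y[b]
            resid = Xb @ w - yb
            # keep only the samples with the smallest squared residuals
            keep = np.argsort(resid ** 2)[: max(1, int(len(b) * (1 - trim)))]
            Xk, rk = Xb[keep], resid[keep]
            # gradient of mean squared error over the kept samples
            w -= lr * 2.0 * (Xk.T @ rk) / len(keep)
    return w
```

On data contaminated with a modest fraction of large outliers, the trimmed updates can still recover the underlying linear coefficients, whereas plain mini-batch gradient descent would be pulled toward the outliers.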