
GAdaBoost: Accelerating Adaboost Feature Selection with Genetic Algorithms

2016-09-20

Mai Tolba, Mohamed Moustafa


Abstract

The boosted cascade of simple features, introduced by Viola and Jones, is one of the most famous object detection frameworks. However, it suffers from a lengthy training process, due to the vast feature space and the exhaustive search nature of AdaBoost. In this paper we propose GAdaBoost: a Genetic Algorithm that accelerates the training procedure through natural feature selection. Specifically, we limit the AdaBoost search to a subset of the huge feature space, while evolving this subset with a Genetic Algorithm. Experiments demonstrate that GAdaBoost is up to 3.7 times faster than AdaBoost, at the cost of only a small decrease in detection accuracy (3% and 4%) when tested on the FDDB face detection benchmark and the Caltech Web Faces dataset, respectively.
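The core idea can be sketched as follows: instead of letting each AdaBoost round search the entire feature pool for the best weak classifier, the search is restricted to a small subset of features, and a Genetic Algorithm evolves that subset between evaluations. The sketch below is a minimal illustration of this scheme, not the authors' implementation: it uses synthetic data and decision stumps in place of the Haar-like feature pool, and all population sizes, round counts, and mutation rates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the huge Haar-like feature pool (hypothetical).
n, d = 200, 50
X = rng.normal(size=(n, d))
y = np.where(X[:, 3] + 0.5 * X[:, 7] > 0, 1, -1)  # labels in {-1, +1}

def stump_predict(col, theta, pol):
    """Decision stump: predict pol above threshold theta, -pol below."""
    return np.where(col > theta, pol, -pol)

def best_stump(w, feat_ids):
    """Exhaustive stump search, restricted to the GA's feature subset."""
    best = (np.inf, feat_ids[0], 0.0, 1)  # (error, feature, theta, polarity)
    for j in feat_ids:
        col = X[:, j]
        for theta in np.quantile(col, [0.25, 0.5, 0.75]):
            for pol in (1, -1):
                err = w @ (stump_predict(col, theta, pol) != y)
                if err < best[0]:
                    best = (err, j, theta, pol)
    return best

def adaboost_train_acc(feat_ids, rounds=5):
    """Training accuracy of a short AdaBoost run limited to feat_ids."""
    w = np.full(n, 1.0 / n)
    F = np.zeros(n)
    for _ in range(rounds):
        err, j, theta, pol = best_stump(w, feat_ids)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X[:, j], theta, pol)
        w *= np.exp(-alpha * y * pred)   # reweight misclassified samples
        w /= w.sum()
        F += alpha * pred
    return np.mean(np.sign(F) == y)

# GA over fixed-size feature subsets; fitness = AdaBoost training accuracy.
pop_size, k, gens = 8, 6, 4
pop = [rng.choice(d, size=k, replace=False) for _ in range(pop_size)]
for _ in range(gens):
    fit = np.array([adaboost_train_acc(ind) for ind in pop])
    new_pop = [pop[int(np.argmax(fit))]]          # elitism: keep the best
    while len(new_pop) < pop_size:
        # tournament selection of two parents
        a, b = (pop[max(rng.choice(pop_size, 2), key=lambda i: fit[i])]
                for _ in range(2))
        genes = np.union1d(a, b)                  # crossover: merge parents
        child = rng.choice(genes, size=k, replace=False)
        if rng.random() < 0.3:                    # mutation: swap in a new feature
            child[rng.integers(k)] = rng.integers(d)
        new_pop.append(child)
    pop = new_pop

best = max(pop, key=adaboost_train_acc)
print(sorted(best), adaboost_train_acc(best))
```

Because each fitness evaluation runs AdaBoost over only `k` features instead of all `d`, the per-round stump search is roughly `d / k` times cheaper; the speedup reported in the paper comes from this restriction outweighing the cost of evolving the subset.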
