
One vs Previous and Similar Classes Learning -- A Comparative Study

2021-01-05

Daniel Cauchi, Adrian Muscat


Abstract

When dealing with multi-class classification problems, it is common practice to build a model consisting of a series of binary classifiers, using a learning paradigm that dictates how the classifiers are built and combined to discriminate between the individual classes. As new data enters the system and the model needs updating, such models often have to be retrained from scratch. This work proposes three learning paradigms that allow trained models to be updated without retraining from scratch. A comparative analysis evaluates them against a baseline. Results show that the proposed paradigms are faster than the baseline at updating, and two of them are also faster at training from scratch, especially on larger datasets, while retaining comparable classification performance.
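The abstract does not specify the paper's learners, but the "one vs previous" idea can be illustrated with a minimal sketch: each newly arriving class gets one binary classifier that separates it from all previously seen data, so adding a class never touches the earlier classifiers. The class names, the nearest-centroid rule, and the newest-first prediction order below are all illustrative assumptions, not the paper's actual method.

```python
# Hypothetical "one vs previous" ensemble (illustrative only, not the paper's
# exact algorithm): one binary decision per class, trained against the data of
# all classes seen before it. Updating with a new class adds one classifier
# and leaves the existing ones untouched, so no retraining from scratch.

def centroid(points):
    """Mean point of a list of equal-length tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def dist2(a, b):
    """Squared Euclidean distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class OneVsPrevious:
    def __init__(self):
        self.classes = []    # class labels in arrival order
        self.models = []     # per class: (own centroid, centroid of previous data)
        self.history = []    # all points seen before the next class arrives

    def add_class(self, label, points):
        # Train one toy binary classifier: new class vs everything previous.
        own = centroid(points)
        prev = centroid(self.history) if self.history else None
        self.classes.append(label)
        self.models.append((own, prev))
        self.history.extend(points)

    def predict(self, x):
        # Apply classifiers newest-first; the first one claiming x wins.
        for label, (own, prev) in zip(reversed(self.classes),
                                      reversed(self.models)):
            if prev is None or dist2(x, own) < dist2(x, prev):
                return label
        return self.classes[0]

model = OneVsPrevious()
model.add_class("A", [(0, 0), (1, 1)])
model.add_class("B", [(9, 9), (10, 10)])   # incremental update: no retraining of A
print(model.predict((0.5, 0.5)))  # → A
print(model.predict((9.5, 9.5)))  # → B
```

Under this scheme, updating with class k costs one classifier fit over the new data versus the old, which is the kind of incremental update the abstract contrasts with retraining the whole series of binary classifiers.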
