
pSVM: Soft-margin SVMs with p-norm Hinge Loss

2024-08-19

Haoxiang Sun


Abstract

Support Vector Machines (SVMs) based on hinge loss have been extensively studied and applied to a wide range of binary classification tasks. These SVMs balance margin maximization against the minimization of slack caused by outliers. Although many efforts have been devoted to improving the performance of SVMs with hinge loss, studies on pSVMs, soft-margin SVMs with p-norm hinge loss, remain relatively scarce. In this paper, we explore the properties, performance, and training algorithms of pSVMs. We first derive a generalization bound for pSVMs, then formulate the dual optimization problem and compare it with the traditional approach. Furthermore, we discuss pSMO, a generalized version of the Sequential Minimal Optimization (SMO) algorithm, for training our pSVM model. Comparative experiments on various datasets, including binary and multi-class classification tasks, demonstrate the effectiveness and advantages of our pSVM model and the pSMO method. Code is available at https://github.com/CoderBak/pSVM.
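To make the objective the abstract describes concrete, here is a minimal NumPy sketch of a soft-margin primal objective in which the usual hinge-loss slack terms are raised to the power p. This is an illustrative sketch only, not the paper's exact formulation; the function name, the regularization parameter C, and the specific form of the penalty are assumptions.

```python
import numpy as np

def p_hinge_objective(w, b, X, y, C=1.0, p=2.0):
    """Illustrative soft-margin SVM objective with p-norm hinge loss:

        0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i (w . x_i + b))^p

    With p = 1 this reduces to the standard hinge-loss soft-margin SVM.
    Labels y are expected in {-1, +1}.
    """
    margins = y * (X @ w + b)          # signed functional margins
    slack = np.maximum(0.0, 1.0 - margins)  # hinge slack per sample
    return 0.5 * np.dot(w, w) + C * np.sum(slack ** p)

# Tiny usage example: two well-separated points incur zero loss,
# so the objective is just the margin term 0.5 * ||w||^2.
w = np.array([1.0, 0.0])
b = 0.0
X = np.array([[2.0, 0.0], [-2.0, 0.0]])
y = np.array([1.0, -1.0])
print(p_hinge_objective(w, b, X, y, p=2.0))  # 0.5
```

Note how p > 1 changes the penalty profile: slacks below 1 are penalized less than under the plain hinge loss, while large violations are penalized more sharply, which is one intuition for why varying p can trade off robustness to mild outliers against sensitivity to gross ones.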
