CatBoost: unbiased boosting with categorical features
Liudmila Prokhorenkova, Gleb Gusev, Aleksandr Vorobev, Anna Veronika Dorogush, Andrey Gulin
Code:
- github.com/catboost/catboost (official, referenced in paper) ★ 0
- github.com/yumoh/catboost_iter ★ 1
- github.com/anantgupta129/CatBoost-in-Python-ML ★ 0
- github.com/sswetank-CS/MIL ★ 0
- github.com/xiadanqing/Binary ★ 0
- github.com/vj-thakur/catboost ★ 0
- github.com/kazeevn/catboost ★ 0
- github.com/hananlibpost/ml_final_project ★ 0
- github.com/sgsonu/SoftBank-Forex-Algorithm-Challenge ★ 0
- github.com/jiangzhongkai/ifly-algorithm_challenge ★ 0
Abstract
This paper presents the key algorithmic techniques behind CatBoost, a new gradient boosting toolkit. The combination of these techniques leads to CatBoost outperforming other publicly available boosting implementations in terms of quality on a variety of datasets. Two critical algorithmic advances introduced in CatBoost are the implementation of ordered boosting, a permutation-driven alternative to the classic algorithm, and an innovative algorithm for processing categorical features. Both techniques were created to combat a prediction shift caused by a special kind of target leakage present in all currently existing implementations of gradient boosting algorithms. The paper provides a detailed analysis of this problem and demonstrates that the proposed algorithms solve it effectively, leading to excellent empirical results.
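The categorical-feature technique the abstract refers to replaces each category with a target statistic computed only from examples that precede the current one in a random permutation, so an example's own label never leaks into its encoding. The following is a minimal sketch of that idea (a simplified illustration, not CatBoost's actual implementation; the function name, `prior` smoothing term, and single-permutation setup are assumptions for exposition):

```python
import random

def ordered_target_stats(categories, targets, prior=0.5, seed=0):
    """Encode each categorical value with a running target mean computed
    over only the examples that come earlier in a random permutation.
    This avoids the target leakage that causes prediction shift: an
    example's own target is never used in its own encoding."""
    n = len(categories)
    rng = random.Random(seed)
    perm = list(range(n))
    rng.shuffle(perm)  # the "artificial time" ordering

    sums = {}    # running sum of targets seen so far, per category
    counts = {}  # running count of examples seen so far, per category
    encoded = [0.0] * n
    for idx in perm:
        c = categories[idx]
        s, cnt = sums.get(c, 0.0), counts.get(c, 0)
        # Smoothed mean over "past" examples only; `prior` keeps the
        # estimate sane for rarely seen categories.
        encoded[idx] = (s + prior) / (cnt + 1)
        # Only now does this example's target enter the statistics.
        sums[c] = s + targets[idx]
        counts[c] = cnt + 1
    return encoded

cats = ["a", "a", "b", "a", "b", "c"]
ys = [1, 0, 1, 1, 0, 1]
enc = ordered_target_stats(cats, ys)
```

Note that a category seen for the first time in the permutation is encoded purely by the prior, regardless of its target; CatBoost itself averages over several permutations to reduce the variance this single-ordering scheme introduces.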