Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance

Haithm Haithm, ALSHARI and Abdulrazak Yahya, SALEH and Alper, ODABAŞ (2021) Comparison of Gradient Boosting Decision Tree Algorithms for CPU Performance. Journal of Institute of Science and Technology, 37 (1). pp. 157-168. ISSN 2467-9240



Gradient Boosting Decision Tree (GBDT) algorithms have proven to be among the best algorithms in machine learning. XGBoost, the most popular GBDT algorithm, has won many competitions on platforms such as Kaggle. However, XGBoost is not the only GBDT algorithm with state-of-the-art performance; other GBDT algorithms, such as LightGBM and CatBoost, offer advantages over XGBoost and are sometimes even more powerful. This paper compares the performance of CPU implementations of the top three gradient boosting algorithms. We begin by explaining how the three algorithms work and the similarities between their hyperparameters. We then evaluate them against four performance criteria: accuracy, speed, reliability, and ease of use. The three algorithms were tested on five classification and regression problems. Our findings show that LightGBM performs best of the three, with a balanced combination of accuracy, speed, reliability, and ease of use, followed by XGBoost with the histogram method; CatBoost came last, with slow and inconsistent performance.

Item Type: Article
Uncontrolled Keywords: Decision Tree, Gradient Boosting, XGBoost, LightGBM, CatBoost.
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Academic Faculties, Institutes and Centres > Faculty of Cognitive Sciences and Human Development
Depositing User: Tekat
Date Deposited: 04 May 2021 06:14
Last Modified: 04 May 2021 06:14
