LightGBM vs XGBoost
Essentials that you need to know about their similarities and differences
Topics covered in this article
Long story short, we will compare LightGBM and XGBoost on the following topics:
- Node splitting
- Tree growing
- Missing data handling
- Categorical feature handling
Quick words on tree boosting
This section describes how LightGBM and XGBoost implement node splitting and tree growing in their algorithms.
Tree boosting is an ensemble method that combines the predictions of a set of tree models into a final prediction. Mathematically, it can be expressed as:

ŷᵢ = Σₖ₌₁..K fₖ(xᵢ)

where each function fₖ is a tree trained sequentially at one boosting iteration.
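To make the ensemble idea concrete, here is a minimal sketch of how a boosted model produces a prediction: it simply sums the outputs of its trees. The `boosted_predict` function and the stand-in "trees" below are hypothetical illustrations, not part of either library's API; in practice each fₖ would be a trained decision tree.

```python
def boosted_predict(trees, x):
    """Sum the prediction of every tree f_k for input x."""
    return sum(f(x) for f in trees)

# Hypothetical stumps standing in for trained trees:
# a real ensemble would hold decision trees fit on gradients.
trees = [
    lambda x: 0.5 * x,              # "tree" 1
    lambda x: 0.1 if x > 0 else -0.1,  # "tree" 2
]

print(boosted_predict(trees, 2.0))  # → 1.1
```

Training adds one such tree per boosting iteration, each fit to correct the residual error of the trees before it.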
Node splitting
At the n-th boosting iteration, LightGBM and XGBoost both train the additional tree by enumerating over all features to find the split that yields the maximum decrease in the loss.
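The search over candidate splits can be sketched as follows. This is a simplified illustration of the gain formula used in gradient boosting (XGBoost-style, with L2 regularization `lam` and the complexity penalty γ omitted), not the libraries' actual implementation; `best_split` and its arguments are hypothetical names.

```python
import numpy as np

def best_split(feature, grad, hess, lam=1.0):
    """Scan one feature's sorted thresholds and return (threshold, gain)
    for the split with the maximum loss reduction.

    grad/hess are per-sample gradients and hessians of the loss.
    Gain = 0.5 * [G_L^2/(H_L+lam) + G_R^2/(H_R+lam) - G^2/(H+lam)].
    """
    order = np.argsort(feature)
    g, h = grad[order], hess[order]
    G, H = g.sum(), h.sum()          # totals for the parent node
    GL = HL = 0.0                    # running left-child sums
    best_thr, best_gain = None, 0.0
    for i in range(len(g) - 1):
        GL += g[i]; HL += h[i]
        GR, HR = G - GL, H - HL
        gain = 0.5 * (GL**2 / (HL + lam)
                      + GR**2 / (HR + lam)
                      - G**2 / (H + lam))
        if gain > best_gain:
            # midpoint between adjacent sorted feature values
            best_thr = (feature[order[i]] + feature[order[i + 1]]) / 2
            best_gain = gain
    return best_thr, best_gain

# Toy data: negative gradients on the left, positive on the right,
# so the best split should fall between 2.0 and 3.0.
feat = np.array([1.0, 2.0, 3.0, 4.0])
grad = np.array([-1.0, -1.0, 1.0, 1.0])
hess = np.ones(4)                    # hessian is 1 for squared error
print(best_split(feat, grad, hess))  # threshold 2.5
```

The full algorithm repeats this scan for every feature and picks the feature/threshold pair with the largest gain; the libraries accelerate it with histogram binning rather than scanning every raw value.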