CatBoost: unbiased boosting with categorical features

Authors: Liudmila Prokhorenkova, Gleb Gusev, Aleksandr Vorobev, Anna Veronika Dorogush, Andrey Gulin

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirical results demonstrate that CatBoost outperforms leading GBDT packages and leads to new state-of-the-art results on common benchmarks."
Researcher Affiliation | Collaboration | 1) Yandex, Moscow, Russia; 2) Moscow Institute of Physics and Technology, Dolgoprudny, Russia
Pseudocode | Yes | "Algorithm 1: Ordered boosting" and "Algorithm 2: Building a tree in CatBoost" (a toy sketch of ordered boosting follows this table)
Open Source Code | Yes | "Their combination is implemented as an open-source library called CatBoost (for Categorical Boosting), which outperforms the existing state-of-the-art implementations of gradient boosted decision trees XGBoost [8] and LightGBM [16] on a diverse set of popular machine learning tasks (see Section 6)." https://github.com/catboost/catboost (a minimal usage sketch follows this table)
Open Datasets | Yes | "We compare our algorithm with the most popular open-source libraries XGBoost and LightGBM on several well-known machine learning tasks. The detailed description of the experimental setup together with dataset descriptions is available in the supplementary material (Section D)."
Dataset Splits | No | "The parameter tuning and training were performed on 4/5 of the data and the testing was performed on the remaining 1/5." (an 80/20 holdout sketch follows this table)
Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, clock speeds, or memory amounts) for its experiments in the main text.
Software Dependencies | No | The paper mentions XGBoost and LightGBM but does not specify version numbers for these or any other software dependencies.
Experiment Setup | No | "The detailed description of the experimental setup together with dataset descriptions is available in the supplementary material (Section D)."
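
For intuition behind the "Algorithm 1: Ordered boosting" pseudocode cited in the Pseudocode row, here is a deliberately naive O(n^2) toy rendering of the idea, not the paper's or the library's actual implementation: each example's residual is computed by a model that never saw that example, which is what removes the prediction shift the paper targets. The weak learner (a depth-1 scikit-learn tree), learning rate, seed, and smoke-test data are all illustrative choices, not taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def ordered_boosting(X, y, n_rounds=10, lr=0.5, seed=0):
    # Toy rendering of ordered boosting: preds[i] holds training-set
    # predictions of a model M_i fit only on the first i examples of a
    # random permutation, so example k's residual comes from M_{pos[k]},
    # which never saw example k.
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    n = len(y)
    pos = np.random.default_rng(seed).permutation(n)  # pos[k] = position of example k
    order = np.argsort(pos)               # examples sorted by their position
    preds = np.zeros((n + 1, n))          # preds[0] is the empty model M_0
    final_trees = []                      # the trees that make up M_n
    for _ in range(n_rounds):
        r = y - preds[pos, np.arange(n)]  # residuals from per-example models
        for i in range(1, n + 1):
            seen = order[:i]              # first i examples in the permutation
            tree = DecisionTreeRegressor(max_depth=1).fit(X[seen], r[seen])
            preds[i] += lr * tree.predict(X)
            if i == n:
                final_trees.append(tree)  # only M_n is used at inference time
    return lambda Xq: lr * sum(t.predict(np.asarray(Xq, dtype=float))
                               for t in final_trees)

# Tiny smoke test with synthetic data (invented for illustration).
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
y = X[:, 0] + 0.1 * rng.normal(size=40)
model = ordered_boosting(X, y)
print(model(X[:5]))
```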
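The Open Source Code row points to the CatBoost repository; for completeness, here is a minimal usage sketch of its Python package (install with `pip install catboost`). The toy data, column indices, and hyperparameters are invented for illustration and are not from the paper.

```python
from catboost import CatBoostClassifier

# Hypothetical toy data: columns 0 and 1 are categorical, column 2 numeric.
X_train = [["a", "x", 1.0], ["b", "y", 2.0], ["a", "y", 3.0], ["b", "x", 4.0]]
y_train = [0, 1, 0, 1]

# cat_features tells CatBoost which columns to treat as categorical,
# so the library can apply its ordered target statistics to them.
model = CatBoostClassifier(iterations=50, verbose=False)
model.fit(X_train, y_train, cat_features=[0, 1])
print(model.predict([["a", "y", 2.5]]))
```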
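Finally, the 4/5 vs. 1/5 partition quoted in the Dataset Splits row corresponds to a standard 80/20 holdout. A minimal sketch assuming scikit-learn, with synthetic stand-in data and an arbitrary seed; the paper's actual datasets are described in its supplementary material (Section D).

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; not one of the paper's benchmark datasets.
X = np.random.rand(100, 5)
y = np.random.randint(0, 2, size=100)

# 4/5 for parameter tuning and training, 1/5 held out for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
```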