Learning to select for a predefined ranking
Authors: Aleksei Ustimenko, Aleksandr Vorobev, Gleb Gusev, Pavel Serdyukov
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our offline and online experiments with a large-scale product search engine demonstrate the overwhelming advantage of our methods over the baselines in terms of all key quality metrics. |
| Researcher Affiliation | Collaboration | 1Yandex, Moscow, Russia 2Skoltech University, Moscow, Russia 3Faculty of Computer Science, Higher School of Economics, Moscow, Russia 4Department of Innovation and High Technology, Moscow Institute of Physics and Technology, Dolgoprudny, Russia. |
| Pseudocode | No | The paper refers to 'Algorithm 3 in (Prokhorenkova et al., 2018)' but does not contain its own pseudocode or algorithm blocks. |
| Open Source Code | Yes | The whole source code of our learning algorithm and its difference from CatBoost release 0.10.04 are available [10]. Later, it was added to CatBoost as the StochasticFilter loss [11]. [10] https://github.com/TakeOver/catboost/tree/0.10.4_release [11] https://github.com/catboost/catboost/commit/df18d16 |
| Open Datasets | Yes | The data with labels used for learning all the models and for evaluation by DCG-RR (see description of labels and metrics below) is available in open source [7]. [7] https://research.yandex.com/datasets/market |
| Dataset Splits | Yes | we randomly split all the queries from the collected dataset D into 5 parts of equal size to run 5-fold cross-validation: at each i of five runs, 80% of the queries (D_train^i) are used for training, 10% (D_valid^i) are used for tuning hyperparameters of the algorithms and 10% (D_test^i) are used for testing. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for experiments. |
| Software Dependencies | Yes | We use the GBDT implementation in the open-sourced CatBoost [9] Python package... The whole source code of our learning algorithm and its difference from CatBoost release 0.10.04 are available [10]. |
| Experiment Setup | No | See Section 4 of the supplementary for CatBoost parameters we used. |
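The query-level 5-fold protocol quoted above (80% train, 10% validation, 10% test per run) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name and the choice to halve each held-out fold into validation and test halves are assumptions consistent with the 80/10/10 split described in the paper.

```python
import random

def query_level_cv_splits(queries, n_folds=5, seed=0):
    """Illustrative query-level 5-fold CV split (80/10/10 per run).

    Whole queries (not individual documents) are shuffled and divided
    into n_folds equal parts; at run i, the held-out part is halved
    into validation and test sets, and the remaining parts form the
    training set.
    """
    qs = list(queries)
    random.Random(seed).shuffle(qs)
    fold_size = len(qs) // n_folds
    folds = [qs[i * fold_size:(i + 1) * fold_size] for i in range(n_folds)]
    splits = []
    for i in range(n_folds):
        # Training set: all folds except the i-th one (80% of queries).
        train = [q for j, fold in enumerate(folds) if j != i for q in fold]
        # Held-out fold split in half: 10% validation, 10% test.
        held_out = folds[i]
        half = len(held_out) // 2
        splits.append({"train": train,
                       "valid": held_out[:half],
                       "test": held_out[half:]})
    return splits

splits = query_level_cv_splits(range(100))
print(len(splits[0]["train"]),
      len(splits[0]["valid"]),
      len(splits[0]["test"]))  # → 80 10 10
```

Splitting at the query level rather than the document level keeps all documents of a query in the same partition, which matters for ranking metrics such as DCG that are computed per query.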