Functional Gradient Boosting based on Residual Network Perception
Authors: Atsushi Nitanda, Taiji Suzuki
ICML 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results show superior performance of the proposed method over state-of-the-art methods such as LightGBM. |
| Researcher Affiliation | Academia | 1Graduate School of Information Science and Technology, The University of Tokyo 2Center for Advanced Intelligence Project, RIKEN. |
| Pseudocode | Yes | Algorithm 1 ResFGB [...] Algorithm 2 Sample-splitting ResFGB |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We use the following benchmark datasets: letter, usps, ijcnn1, mnist, covtype, and susy. |
| Dataset Splits | Yes | For datasets not providing a fixed test set, we first divide each dataset randomly into two parts: 80% for training and the rest for test. We next divide each training set randomly and use 80% for training and the rest for validation. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for experiments. |
| Software Dependencies | No | The paper mentions using a |
| Experiment Setup | Yes | The number of hidden units in each layer is set to 100 or 1000. Linear classifiers and embeddings are trained by Nesterov's momentum method. The learning rate is chosen from {10⁻³, 10⁻², 10⁻¹, 1}. These parameters and the number of iterations T are tuned based on the performance on the validation set. |
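
The dataset-splitting scheme and hyperparameter grid quoted above can be sketched as follows. This is an illustrative reconstruction, not the authors' code (no implementation is released per the table); the function name `split_indices` and the seed handling are assumptions, while the 80/20/20 proportions and the grids for learning rate and hidden units come from the quoted responses.

```python
import numpy as np

def split_indices(n, seed=0):
    """Split n examples per the paper's scheme: 80% train+val vs. 20% test,
    then the train+val portion is split again 80% train / 20% validation.
    (Used only when the dataset does not ship a fixed test set.)"""
    rng = np.random.default_rng(seed)  # seed handling is an assumption
    perm = rng.permutation(n)
    n_trainval = int(0.8 * n)
    test = perm[n_trainval:]
    trainval = perm[:n_trainval]
    n_train = int(0.8 * n_trainval)
    train = trainval[:n_train]
    val = trainval[n_train:]
    return train, val, test

# Hyperparameter grids stated in the paper; the boosting iteration count T
# is likewise tuned on the validation split.
learning_rates = [1e-3, 1e-2, 1e-1, 1.0]
hidden_units = [100, 1000]
```

For a dataset of 1000 examples this yields 640 training, 160 validation, and 200 test indices, matching the nested 80/20 splits described in the Dataset Splits row.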