Progressive Feature Interaction Search for Deep Sparse Network
Authors: Chen Gao, Yinfeng Li, Quanming Yao, Depeng Jin, Yong Li
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on three real-world benchmark datasets show promising results of PROFIT in both accuracy and efficiency. |
| Researcher Affiliation | Collaboration | ¹Beijing National Research Center for Information Science and Technology, Department of Electronic Engineering, Tsinghua University; ²4Paradigm Inc. |
| Pseudocode | Yes | Algorithm 1 Progressive gradient descent. |
| Open Source Code | No | The paper provides links to baseline implementations but does not provide concrete access to its own source code, nor does it explicitly state that its own code will be released. |
| Open Datasets | Yes | To validate the effectiveness of our proposed PROFIT, we conduct experiments on three benchmark datasets (Criteo, Avazu and ML1M) widely used in existing works of deep sparse networks [5, 31, 18] to evaluate the performance, of which the details are provided in the Appendix. |
| Dataset Splits | Yes | where Dtra and Dval denote the training and validation datasets, respectively. All the other hyper-parameters are tuned on the validation set. |
| Hardware Specification | No | The paper mentions 'our normal hardware platform' but does not provide specific details such as GPU models, CPU types, or memory amounts used for running experiments. |
| Software Dependencies | No | The paper states 'We implement our methods using PyTorch' but does not specify the version number for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | We apply Adam with a learning rate of 0.001 and a mini-batch size of 4096, a widely-used setting in existing works [5, 31]. We set the embedding sizes to 16 in all the models. We use the same neural network structure ({400, 400, 400}) for all methods that adopt MLP for a fair comparison, following [5, 31]. |
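The quoted experiment setup maps onto a short PyTorch configuration. The sketch below illustrates only the reported hyper-parameters (Adam with learning rate 0.001, mini-batch size 4096, embedding size 16, MLP tower {400, 400, 400}); it is not the authors' code, which is not released. The `SparseMLP` class, the field cardinalities, and the random tensors standing in for a real data loader are assumptions for illustration.

```python
# Minimal sketch of the quoted training configuration, assuming a PyTorch
# deep sparse (CTR-style) network. Model class, field layout, and data are
# hypothetical; only the hyper-parameters come from the paper.
import torch
import torch.nn as nn


class SparseMLP(nn.Module):
    """One embedding table per categorical field, followed by an MLP tower."""

    def __init__(self, field_dims, embed_dim=16, hidden_dims=(400, 400, 400)):
        super().__init__()
        self.embeddings = nn.ModuleList(
            [nn.Embedding(n, embed_dim) for n in field_dims]
        )
        layers, in_dim = [], embed_dim * len(field_dims)
        for h in hidden_dims:
            layers += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        layers.append(nn.Linear(in_dim, 1))  # single logit for click prediction
        self.mlp = nn.Sequential(*layers)

    def forward(self, x):  # x: LongTensor of shape (batch, num_fields)
        emb = torch.cat([e(x[:, i]) for i, e in enumerate(self.embeddings)], dim=1)
        return self.mlp(emb).squeeze(-1)


# Hypothetical field cardinalities; real values depend on the dataset
# (Criteo, Avazu, or ML1M after preprocessing).
field_dims = [1000, 500, 200]
model = SparseMLP(field_dims)                                   # embedding size 16, MLP {400, 400, 400}
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)        # Adam, learning rate 0.001
criterion = nn.BCEWithLogitsLoss()
batch_size = 4096                                                # mini-batch size from the paper

# One illustrative training step on random tensors standing in for a data loader.
x = torch.randint(0, 200, (batch_size, len(field_dims)))
y = torch.rand(batch_size)
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```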