Learning to Binarize Continuous Features for Neuro-Rule Networks

Authors: Wei Zhang, Yongxiang Liu, Zhuo Wang, Jianyong Wang

IJCAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We conduct comprehensive experiments on public datasets and demonstrate the effectiveness of AutoInt in boosting the performance of NRNs. (Section 4, Experiments; Section 4.1, Experimental Setup)
Researcher Affiliation | Academia | School of Computer Science and Technology, East China Normal University; Shanghai Institute for AI Education; Department of Computer Science and Technology, Tsinghua University
Pseudocode | No | The paper describes computational procedures (e.g., Equations 1, 3, 4, 6, and 7) and provides a high-level overview of AutoInt's modules, but it does not include a formally labeled 'Pseudocode' or 'Algorithm' block.
Open Source Code | Yes | The source code of AutoInt is available at https://github.com/yxliu99/AutoInt.
Open Datasets | Yes | We use six public datasets from the UCI dataset repository (https://archive.ics.uci.edu/ml/datasets.php). All of them contain continuous features, as required for testing the effectiveness of binarization methods. Table 1 summarizes the statistics of the six datasets.
Dataset Splits | Yes | To obtain a reliable performance evaluation, we adopt 5-fold cross-validation and report the average performance, following [Wang et al., 2021]. When hyperparameters need to be tuned, we use 80% of the training set for optimization and the rest for validation.
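The evaluation protocol quoted above (5-fold cross-validation, with 80% of each training fold used for hyperparameter optimization and 20% for validation) can be sketched as follows. This is an illustrative reconstruction using scikit-learn, not the authors' code; `fit_and_score` is a hypothetical callback standing in for model training and evaluation.

```python
import numpy as np
from sklearn.model_selection import KFold, train_test_split

def evaluate_with_cv(X, y, fit_and_score, seed=0):
    """Sketch of the protocol: 5-fold CV, reporting the average test score.

    Within each training fold, 80% is used for hyperparameter optimization
    and the remaining 20% for validation, as described in the paper.
    `fit_and_score` is a hypothetical callback:
        fit_and_score(X_opt, y_opt, X_val, y_val, X_test, y_test) -> float
    """
    scores = []
    for train_idx, test_idx in KFold(n_splits=5, shuffle=True,
                                     random_state=seed).split(X):
        X_train, X_test = X[train_idx], X[test_idx]
        y_train, y_test = y[train_idx], y[test_idx]
        # Hold out 20% of the training fold for validation.
        X_opt, X_val, y_opt, y_val = train_test_split(
            X_train, y_train, test_size=0.2, random_state=seed)
        scores.append(fit_and_score(X_opt, y_opt, X_val, y_val, X_test, y_test))
    # Average performance over the 5 folds.
    return float(np.mean(scores))
```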
Hardware Specification | No | The paper does not provide any specific hardware details, such as GPU models, CPU types, or memory specifications, used for the experiments.
Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., 'PyTorch 1.9', 'Python 3.8'). It only states that 'The source code of AutoInt is available at https://github.com/yxliu99/AutoInt', which may imply that dependencies are listed in the repository, but they are not given explicitly in the text.
Experiment Setup | Yes | For the proposed AutoInt, the temperatures τ1 and τ2 are both selected from {500, 1000, 2000}, K is searched in {5, 10, 15, 20, 30, 50}, and λ is tuned in {0.005, 0.01, 0.05, 0.1, 0.2, 0.5}. Unless otherwise stated, the intervals are initialized by FreInt. Besides, all continuous feature values are pre-processed with standard normalization. For the neuro-rule network RRL and for XGBoost (which uses the continuous features directly as input), we also follow [Wang et al., 2021] in setting their hyperparameters.
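The search space and the standard-normalization pre-processing described above can be written down concretely. The grids below are the ones quoted from the paper; the function names and the assumption that τ1 and τ2 are tuned independently over the same grid are illustrative, not taken from the authors' code.

```python
import itertools
import numpy as np

# Hyperparameter grids reported for AutoInt.
TAU_GRID = [500, 1000, 2000]                       # temperatures tau_1, tau_2
K_GRID = [5, 10, 15, 20, 30, 50]                   # number of intervals K
LAMBDA_GRID = [0.005, 0.01, 0.05, 0.1, 0.2, 0.5]   # regularization weight lambda

def standardize(X):
    """Standard normalization of continuous features: zero mean, unit variance
    per column (constant columns are left centered but not scaled)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / np.where(sigma == 0, 1.0, sigma)

def hyperparameter_grid():
    """Enumerate every combination searched (assuming tau_1 and tau_2 are
    tuned independently over the same grid): 3 * 3 * 6 * 6 = 324 settings."""
    for tau1, tau2, K, lam in itertools.product(
            TAU_GRID, TAU_GRID, K_GRID, LAMBDA_GRID):
        yield {"tau1": tau1, "tau2": tau2, "K": K, "lambda": lam}
```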