Fuzzy Learning Machine

Authors: Junbiao Cui, Jiye Liang

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Experimental | The systematic experimental results on a large number of data sets show that FLM can achieve excellent performance, even with a simple implementation.
Researcher Affiliation | Academia | Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, School of Computer and Information Technology, Shanxi University, Taiyuan 030006, Shanxi, China.
Pseudocode | Yes | Algorithm 1: the training process of FLM; Algorithm 2: the test process of FLM.
Open Source Code | Yes | The code of the proposed method is provided in the supplementary material.
Open Datasets | Yes | The MNIST data set [31] is chosen to demonstrate the working mechanism of NN-FLM; experiments also use 121 benchmark data sets (see Table 2 in Appendix A.4.2), which can be downloaded from the links in Appendix A.4.
Dataset Splits | No | The paper defines Dtrain and Dtest but does not explicitly specify a validation split.
Hardware Specification | No | The paper answers "[N/A]" to the checklist question on the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider), and gives no hardware details in the main text.
Software Dependencies | No | The paper states that code is provided in the supplementary material but does not list software dependencies with version numbers (e.g., Python, PyTorch, or TensorFlow versions).
Experiment Setup | Yes | On MNIST, NN-FLM uses a 5-layer convolutional neural network as the feature extraction network, with the fuzzy parameters fixed at α = 0.2, β = 0.8. On the benchmark data sets, NN-FLM adopts a 3-layer fully connected network as the feature extraction network, with the same fixed fuzzy parameters α = 0.2, β = 0.8.
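The setup row above fixes two fuzzy parameters, α = 0.2 and β = 0.8, without defining their role in this summary. One common reading of such a threshold pair in fuzzy methods is a piecewise-linear membership function: scores below α get membership 0, scores above β get membership 1, and scores in between are interpolated linearly. The sketch below is a hypothetical illustration of that reading, not the paper's actual formulation; the function name and its use of (α, β) are assumptions.

```python
def fuzzy_membership(score: float, alpha: float = 0.2, beta: float = 0.8) -> float:
    """Hypothetical piecewise-linear membership function.

    Returns 0 for scores at or below alpha, 1 for scores at or above beta,
    and interpolates linearly in between. This illustrates one common role
    of (alpha, beta) thresholds; the paper's exact use may differ.
    """
    if score <= alpha:
        return 0.0
    if score >= beta:
        return 1.0
    return (score - alpha) / (beta - alpha)

# A score halfway between the thresholds maps to membership 0.5.
print(fuzzy_membership(0.5))  # -> 0.5
```

With α = 0.2 and β = 0.8 the transition band is symmetric around 0.5, which is consistent with using the same fixed pair across both the convolutional and fully connected variants described above.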