Self-Supervised Graph Learning for Long-Tailed Cognitive Diagnosis
Authors: Shanshan Wang, Zhen Zeng, Xun Yang, Xingyi Zhang
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on real-world datasets show the effectiveness of our approach, especially on the students with much sparser interaction records. |
| Researcher Affiliation | Academia | 1 Anhui University, Hefei, China; 2 Institute of Artificial Intelligence, Hefei Comprehensive National Science Center, Hefei, China; 3 University of Science and Technology of China, Hefei, China |
| Pseudocode | No | The paper describes the methodology in text and uses figures (e.g., Figure 2) to illustrate the model, but it does not include a dedicated pseudocode block or algorithm listing. |
| Open Source Code | Yes | Our code is available at https://github.com/zeng-zhen/SCD. |
| Open Datasets | Yes | We conduct experiments on two real-world datasets: junyi (1) and ASSIST (2). (1) https://pslcdatashop.web.cmu.edu/DatasetInfo?datasetId=1198 (2) https://sites.google.com/site/assistmentsdata/home/20092010-assistment-data/skill-builder-data-20092010 |
| Dataset Splits | Yes | To explore the effect of different sparse data on the experimental results, we divided the data set into different proportions. Table 2 reports acc, rmse, acc50, and rmse50 for each train:test split of 5:5, 6:4, 7:3, and 8:2. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper states "We implement our SCD with PyTorch." but does not specify a version number for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | For each model we set the batch size to 256. As for graph-based models, i.e. RCD and SCD, we set the layers of the graph network to 2. |
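The train:test proportions quoted above (5:5, 6:4, 7:3, 8:2) can be reproduced with a simple ratio-based split; this is a minimal sketch, assuming a global random shuffle of interaction records — the paper does not specify whether splitting is global or per-student, and the `split_records` helper is a hypothetical name, not from the released code:

```python
import random

def split_records(records, train_ratio, seed=0):
    """Shuffle interaction records and split them at the given train ratio."""
    rng = random.Random(seed)
    shuffled = list(records)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

# The paper evaluates four ratios: 5:5, 6:4, 7:3, 8:2.
records = list(range(1000))  # placeholder for (student, exercise, response) logs
for ratio in (0.5, 0.6, 0.7, 0.8):
    train, test = split_records(records, ratio)
    print(f"train:test = {ratio:.0%}:{1 - ratio:.0%} -> {len(train)} / {len(test)}")
```

A fixed seed keeps the splits deterministic across runs, which matters when comparing acc/rmse across the four sparsity settings.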