Collaborative Cognitive Diagnosis with Disentangled Representation Learning for Learner Modeling
Authors: Weibo Gao, Qi Liu, Linan Yue, Fangzhou Yao, Hao Wang, Yin Gu, Zheng Zhang
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments demonstrate the superior performance of Coral, showcasing significant improvements over state-of-the-art methods across several real-world datasets. |
| Researcher Affiliation | Academia | State Key Laboratory of Cognitive Intelligence, University of Science and Technology of China |
| Pseudocode | Yes | Algorithm 1 Coral Model |
| Open Source Code | Yes | Our code is available at https://github.com/bigdata-ustc/Coral. |
| Open Datasets | Yes | We conduct experiments on three real-world datasets: ASSIST [11], Junyi [5] and NeurIPS2020EC [43]. |
| Dataset Splits | Yes | In this setting, we split all the datasets with a 7:1:2 ratio into training sets, validation sets, and test sets. (A minimal split sketch follows the table.) |
| Hardware Specification | Yes | All experiments are conducted on a Linux server equipped with two 3.00GHz Intel Xeon Gold 5317 CPUs and two Tesla A100 40G GPUs. |
| Software Dependencies | No | Each model is implemented by PyTorch [37] and optimized by the Adam optimizer [19]. (No version number is specified for PyTorch.) |
| Experiment Setup | Yes | We set the dimension size d as 20, the layer of graph modeling as 2, and the mini-batch size as 512. In the training stage, we select the learning rate from {0.002, 0.005, 0.01, 0.02, 0.05}, select α from {0.05, 0.1, 0.5, 1} and β from {0.25, 0.5, 1}, and select the neighboring number K from {1, 2, 3, 4, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50}. All network parameters are initialized with Xavier initialization [15]. (A training-setup sketch follows the table.) |
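
The 7:1:2 dataset split quoted above gives only the ratio, not the procedure. Below is a minimal sketch, assuming the response records are rows of a NumPy array and that the split is a random shuffle with a fixed seed (both assumptions; the paper's quote does not specify them):

```python
import numpy as np

def split_7_1_2(records: np.ndarray, seed: int = 42):
    """Shuffle response records and split them 7:1:2 into
    train / validation / test sets (ratio from the paper;
    shuffling and seed are assumptions)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(records))
    n_train = int(0.7 * len(records))
    n_valid = int(0.1 * len(records))
    train = records[idx[:n_train]]
    valid = records[idx[n_train:n_train + n_valid]]
    test = records[idx[n_train + n_valid:]]
    return train, valid, test
```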
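
The experiment-setup row reports d = 20, two graph-modeling layers, mini-batches of 512, Xavier initialization, and grids for the learning rate, α, and β. The sketch below shows how that configuration might be wired up in PyTorch. It is not the authors' implementation: `CoralNet` is a hypothetical stand-in for the actual Coral architecture (available in the linked repository), the learner/exercise counts are placeholders, and the use of α and β as auxiliary loss weights is an assumption.

```python
import itertools
import torch
import torch.nn as nn

# Hyperparameters reported in the table above.
D_MODEL = 20          # embedding dimension d
NUM_GNN_LAYERS = 2    # layers of graph modeling (unused in this stub)
BATCH_SIZE = 512

LR_GRID    = [0.002, 0.005, 0.01, 0.02, 0.05]
ALPHA_GRID = [0.05, 0.1, 0.5, 1]
BETA_GRID  = [0.25, 0.5, 1]

class CoralNet(nn.Module):
    """Placeholder network; only the Xavier initialization and
    optimizer wiring mirror the reported setup."""
    def __init__(self, n_learners: int, n_exercises: int, d: int = D_MODEL):
        super().__init__()
        self.learner_emb = nn.Embedding(n_learners, d)
        self.exercise_emb = nn.Embedding(n_exercises, d)
        self.predict = nn.Linear(2 * d, 1)
        # Xavier initialization for all weight matrices, as stated.
        for p in self.parameters():
            if p.dim() > 1:
                nn.init.xavier_uniform_(p)

    def forward(self, learner_ids, exercise_ids):
        x = torch.cat([self.learner_emb(learner_ids),
                       self.exercise_emb(exercise_ids)], dim=-1)
        return torch.sigmoid(self.predict(x)).squeeze(-1)

# Grid search over the reported hyperparameter ranges; the best
# configuration would be chosen on the validation split.
for lr, alpha, beta in itertools.product(LR_GRID, ALPHA_GRID, BETA_GRID):
    model = CoralNet(n_learners=4000, n_exercises=17000)  # placeholder sizes
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    # ... train with mini-batches of size BATCH_SIZE, weighting the
    # auxiliary loss terms by alpha and beta (assumed roles), then
    # evaluate on the validation set.
```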