Boosting Neural Cognitive Diagnosis with Student’s Affective State Modeling
Authors: Shanshan Wang, Zhen Zeng, Xun Yang, Ke Xu, Xingyi Zhang
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on real-world datasets clearly demonstrate the effectiveness of our ACD. |
| Researcher Affiliation | Academia | (1) Information Materials and Intelligent Sensing Laboratory of Anhui Province, Anhui University, Hefei, China; (2) Institutes of Physical Science and Information Technology, Anhui University, Hefei, China; (3) University of Science and Technology of China, Hefei, China |
| Pseudocode | No | The paper describes the model with equations and textual descriptions but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at https://github.com/zengzhen/ACD. |
| Open Datasets | Yes | ASSIST17 is collected for ASSISTments Longitudinal Data Mining Competition in 2017. ASSIST12 is the data for the school year 2012-2013 with affect. Junyi is collected from the Chinese e-learning website Junyi Academy (Chang, Hsu, and Chen 2015). 80% data of each student is leveraged for training and the remaining 20% for testing. |
| Dataset Splits | Yes | 80% data of each student is leveraged for training and the remaining 20% for testing. (See the per-student split sketch after this table.) |
| Hardware Specification | Yes | The experiments are conducted on the Intel Core i9-10900X CPU and a GeForce RTX 3090 GPU. |
| Software Dependencies | No | The paper mentions PyTorch as an implementation tool but does not provide a specific version number for it or for other software dependencies. |
| Experiment Setup | Yes | We set the coefficient λ for the affect prediction loss in Eq. 14 to 1 to avoid over-adjusting, and the dimension of the affect latent traits for students, denoted as d, to 128 for good performance. The network parameters are initialized using Xavier initialization following (Wang et al. 2020). All the weights are sampled from N(0, 2/(n_in + n_out)), where n_in refers to the input dimensionality of a layer and n_out refers to the output dimensionality. (See the initialization sketch after this table.) |
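The Dataset Splits row quotes an 80%/20% split of each student's data, but the paper excerpt does not say whether each student's records are split chronologically or at random. The sketch below is a minimal illustration, assuming a pandas DataFrame with a `student_id` column (a hypothetical name, not taken from the released code) and a simple ordered split per student.

```python
import pandas as pd

def split_per_student(logs: pd.DataFrame, train_ratio: float = 0.8):
    """Split each student's records into train/test with the quoted 80/20 ratio.

    Assumptions (not specified in the paper): the DataFrame has a `student_id`
    column, and each student's records are split in their stored order rather
    than shuffled.
    """
    train_parts, test_parts = [], []
    for _, student_logs in logs.groupby("student_id", sort=False):
        cut = int(len(student_logs) * train_ratio)
        train_parts.append(student_logs.iloc[:cut])   # first 80% -> train
        test_parts.append(student_logs.iloc[cut:])    # remaining 20% -> test
    return pd.concat(train_parts), pd.concat(test_parts)

# Usage (hypothetical variable name):
# train_df, test_df = split_per_student(interaction_logs)
```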
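The Experiment Setup row states that weights are sampled from N(0, 2/(n_in + n_out)), i.e. Xavier (normal) initialization, with λ = 1 and d = 128. Below is a minimal PyTorch sketch of that initialization; the example layer and the zero bias initialization are illustrative assumptions, not details taken from the paper.

```python
import math
import torch.nn as nn

LAMBDA_AFFECT = 1.0  # quoted coefficient λ for the affect prediction loss (Eq. 14)
AFFECT_DIM = 128     # quoted dimension d of the students' affect latent traits

def init_xavier_normal(layer: nn.Linear) -> None:
    """Sample weights from N(0, 2 / (n_in + n_out)) as quoted in the setup.

    This matches torch.nn.init.xavier_normal_ with gain=1; initializing the
    bias to zero is an assumption, since the paper does not mention biases.
    """
    std = math.sqrt(2.0 / (layer.in_features + layer.out_features))
    nn.init.normal_(layer.weight, mean=0.0, std=std)
    if layer.bias is not None:
        nn.init.zeros_(layer.bias)

# Example: apply to a hypothetical projection layer over the affect traits.
affect_proj = nn.Linear(AFFECT_DIM, AFFECT_DIM)
init_xavier_normal(affect_proj)
```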