Robust Lightweight Facial Expression Recognition Network with Label Distribution Training

Authors: Zengqun Zhao, Qingshan Liu, Feng Zhou

AAAI 2021, pp. 3510-3519

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments conducted on realistic occlusion and pose variation datasets demonstrate that the proposed EfficientFace is robust under occlusion and pose variation conditions. Moreover, the proposed method achieves state-of-the-art results on RAF-DB, CAER-S, and AffectNet-7 datasets with accuracies of 88.36%, 85.87%, and 63.70%, respectively, and a comparable result on the AffectNet-8 dataset with an accuracy of 59.89%.
Researcher Affiliation | Academia | Zengqun Zhao, Qingshan Liu* and Feng Zhou, B-DAT Lab, Nanjing University of Information Science & Technology, Nanjing, China, {zqzhao, qsliu}@nuist.edu.cn
Pseudocode | No | No. The paper describes architectural components and the training process in text and refers to Figure 3 for the overall structure, but it does not include any pseudocode or formal algorithm blocks.
Open Source Code | Yes | The code and training logs are available at https://github.com/zengqunzhao/EfficientFace.
Open Datasets | Yes | To verify the effectiveness of the proposed method, we conduct the experiments on three popular in-the-wild facial expression datasets: RAF-DB (Li and Deng 2018), CAER-S (Lee et al. 2019), and AffectNet (Mollahosseini, Hasani, and Mahoor 2017), and five realistic occlusion and pose variation datasets: FED-RO (Li et al. 2019c), Occlusion-AffectNet, Occlusion-RAF-DB, Pose-AffectNet and Pose-RAF-DB (Wang et al. 2020b).
Dataset Splits | No | No. The paper provides explicit training and testing sample counts for RAF-DB, CAER-S, and AffectNet, but it does not explicitly describe a separate validation split for these main datasets (a hypothetical way of holding one out is sketched below the table).
Hardware Specification | Yes | All the models are trained on the NVIDIA GeForce Titan Xp GPU based on the open-source PyTorch (Paszke et al. 2019) platform.
Software Dependencies | No | No. The paper mentions the 'PyTorch (Paszke et al. 2019) platform' but does not provide a specific version number for PyTorch or any other software dependencies.
Experiment Setup | Yes | For EfficientFace, parameters were optimized via the SGD optimizer with an initial learning rate of 0.1 and a mini-batch size of 128 (a minimal PyTorch sketch of this setup appears below the table).
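
Regarding the Dataset Splits row: since the paper reports only training and testing counts, a reproducer who wants a validation set has to carve one out of the training data themselves. The sketch below is purely hypothetical; the 90/10 ratio, the seed, and the placeholder tensors are assumptions, not details from the paper.

```python
# Hypothetical validation hold-out; no validation split is specified in the paper.
import torch
from torch.utils.data import TensorDataset, random_split

# Placeholder stand-in for a facial expression training set (7 expression classes).
full_train = TensorDataset(torch.randn(1000, 3, 64, 64), torch.randint(0, 7, (1000,)))

val_size = len(full_train) // 10                      # assumed 10% hold-out
train_size = len(full_train) - val_size
train_set, val_set = random_split(
    full_train, [train_size, val_size],
    generator=torch.Generator().manual_seed(42))      # fixed seed so the split is reproducible
```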
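
Regarding the Experiment Setup row: the only reported details are the optimizer (SGD), the initial learning rate (0.1), and the mini-batch size (128). A minimal sketch of that configuration follows; the toy model, the cross-entropy loss, the input resolution, and the epoch count are placeholders rather than details from the paper (the actual method trains the released EfficientFace architecture with label distributions).

```python
# Minimal training-loop sketch for the reported settings: SGD, initial LR 0.1, batch size 128.
# Everything except those three values is a placeholder assumption.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in data and model so the sketch runs end to end.
images = torch.randn(256, 3, 64, 64)
labels = torch.randint(0, 7, (256,))                 # 7 basic expression classes
train_loader = DataLoader(TensorDataset(images, labels), batch_size=128, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 7))
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

criterion = nn.CrossEntropyLoss()                    # assumed loss; the paper uses label distribution training
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # reported optimizer and initial learning rate

for epoch in range(2):                               # epoch count is a placeholder
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
```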