Human Joint Kinematics Diffusion-Refinement for Stochastic Motion Prediction
Authors: Dong Wei, Huaijiang Sun, Bin Li, Jianfeng Lu, Weiqing Li, Xiaoning Sun, Shengxiang Hu
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on two datasets demonstrate that our model yields competitive performance in terms of both diversity and accuracy. Extensive experiments show that our model achieves state-of-the-art performance on both the Human3.6M and HumanEva-I datasets. |
| Researcher Affiliation | Collaboration | (1) School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing, China; (2) Tianjin Ai Forward Science and Technology Co., Ltd., Tianjin, China |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | More visualization results can be found in https://github.com/csdwei/MotionDiff. |
| Open Datasets | Yes | we evaluate our model on two public benchmark datasets including Human3.6M (Ionescu et al. 2013) and HumanEva-I (Sigal, Balan, and Black 2010). |
| Dataset Splits | No | We use 5 subjects (S1, S5, S6, S7, S8) to train the model, and the rest (S9, S11) for evaluation. |
| Hardware Specification | Yes | All the experiments are implemented on an NVIDIA RTX 3080 GPU. |
| Software Dependencies | No | Our code is in Pytorch (Paszke et al. 2017) and we use ADAM (Kingma and Ba 2015) optimizer. |
| Experiment Setup | Yes | For the diffusion network, we use a joint embedding layer to upsample the 3D coordinates of human joints from 3 to 32 dimensions, and then feed them into a transformer whose hidden dimension is set to 512. For the refinement network, we use a 12-layer graph convolution network and set the hidden size to 256 in each layer. We set the variance schedule endpoints to β1 = 0.0001 and βK = 0.05, with the intermediate βk linearly interpolated (1 < k < K). We train our diffusion model for 1,000 epochs with a batch size of 64 on both datasets. |
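The variance schedule described in the setup row (β1 = 0.0001 and βK = 0.05, linearly interpolated) can be sketched as below. This is a minimal illustration, not the authors' code: the number of diffusion steps `K` is not stated in the excerpt, so `K = 1000` here is a placeholder assumption, and the derived `alpha_bars` follows the standard DDPM convention rather than anything quoted from the paper.

```python
import torch

def linear_beta_schedule(beta_1: float = 1e-4,
                         beta_K: float = 0.05,
                         K: int = 1000) -> torch.Tensor:
    """Linearly interpolate the per-step variances beta_k from beta_1 to beta_K.

    K (total diffusion steps) is an assumed placeholder; the paper does not
    state its value in the quoted setup.
    """
    return torch.linspace(beta_1, beta_K, K)

betas = linear_beta_schedule()
alphas = 1.0 - betas
# Cumulative product alpha_bar_k, used in the closed-form forward
# diffusion q(x_k | x_0) in standard DDPM formulations.
alpha_bars = torch.cumprod(alphas, dim=0)
```

With this schedule the noise level grows linearly per step, so `alpha_bars` decays monotonically toward zero, pushing the final diffused pose toward pure Gaussian noise.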