RealDex: Towards Human-like Grasping for Robotic Dexterous Hand

Authors: Yumeng Liu, Yaxun Yang, Youzhuo Wang, Xiaofei Wu, Jiamin Wang, Yichen Yao, Sören Schwertfeger, Sibei Yang, Wenping Wang, Jingyi Yu, Xuming He, Yuexin Ma

IJCAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments have demonstrated the superior performance of our method on RealDex and other open datasets.
Researcher Affiliation | Academia | 1ShanghaiTech University, 2The University of Hong Kong, 3Texas A&M University
Pseudocode | No | The paper describes methods in text and equations but does not include structured pseudocode or algorithm blocks.
Open Source Code | Yes | The dataset and associated code are available at https://4dvlab.github.io/RealDex_page/.
Open Datasets | Yes | The dataset and associated code are available at https://4dvlab.github.io/RealDex_page/. ... We also conduct evaluation on human hand grasping dataset GRAB [Taheri et al., 2020]. ... We evaluate on GRAB [Taheri et al., 2020], DexGraspNet [Wang et al., 2023] and our dataset RealDex.
Dataset Splits | Yes | We have divided our dataset into training, validation, and test sets, ensuring that each object appears exclusively in one of these subsets. The three sets contain 2114 grasping motion sequences for 40 objects, 245 grasping motions for 6 objects, and 271 grasping motions for 6 objects, respectively. (See the sketch after this table.)
Hardware Specification | Yes | Models were both trained and tested on an Ubuntu server, equipped with eight NVIDIA GeForce RTX 3090 GPU cards.
Software Dependencies | No | The paper mentions 'Python with PyTorch framework' and 'Gemini [Google, 2023]' but does not provide specific version numbers for these software components.
Experiment Setup | No | More details for the training process, inference process, loss functions are introduced in the supplementary material.
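
The object-exclusive split quoted in the Dataset Splits row can be illustrated with a minimal sketch. This is a hypothetical illustration, not code released with RealDex; the function name, the (object_name, sequence_id) input format, and the random assignment of objects to subsets are assumptions made for the example.

```python
# Hypothetical sketch (not released RealDex code) of an object-exclusive
# train/val/test split: every object's grasping sequences land in exactly
# one subset, mirroring the protocol quoted in the Dataset Splits row.
import random
from collections import defaultdict

def split_by_object(sequences, n_val_objects=6, n_test_objects=6, seed=0):
    """sequences: iterable of (object_name, sequence_id) pairs."""
    by_object = defaultdict(list)
    for obj, seq in sequences:
        by_object[obj].append(seq)

    objects = sorted(by_object)          # deterministic order before shuffling
    random.Random(seed).shuffle(objects)

    test_objs = set(objects[:n_test_objects])
    val_objs = set(objects[n_test_objects:n_test_objects + n_val_objects])

    splits = {"train": [], "val": [], "test": []}
    for obj, seqs in by_object.items():
        if obj in test_objs:
            splits["test"].extend(seqs)
        elif obj in val_objs:
            splits["val"].extend(seqs)
        else:
            splits["train"].extend(seqs)
    return splits
```

Splitting at the object level, rather than at the sequence level, is what prevents grasp sequences of a test object from leaking into training.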