Factor Graph Neural Networks

Authors: Zhen Zhang, Fan Wu, Wee Sun Lee

NeurIPS 2020

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on synthetic and real datasets demonstrate the potential of the proposed architecture. |
| Researcher Affiliation | Academia | (1) Australian Institute for Machine Learning & The University of Adelaide, Australia; (2) University of Illinois at Urbana-Champaign; (3) School of Computing, National University of Singapore |
| Pseudocode | Yes | Algorithm 1: The FGNN layer |
| Open Source Code | No | The paper does not contain an explicit statement about releasing the source code for its methodology, nor a link to a code repository. |
| Open Datasets | Yes | "We train our model on the Human3.6M dataset using the standard training-val-test split as previous works [17, 20, 22]" |
| Dataset Splits | Yes | "We train our model on the Human3.6M dataset using the standard training-val-test split as previous works [17, 20, 22]" |
| Hardware Specification | No | The paper does not provide specific hardware details such as exact GPU/CPU models, processor types, or memory specifications used for running the experiments. |
| Software Dependencies | No | The model is implemented using PyTorch [27], but no specific version number for PyTorch or any other software dependency is provided. |
| Experiment Setup | Yes | "In this task, we use a factor graph neural network consisting of 8 FGNN layers (the details are provided in the supplementary file). The model is implemented using pytorch [27], trained with Adam optimizer [12] with initial learning rate lr = 3 × 10⁻³ and after each epoch, lr is decreased by a factor of 0.98." |
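The reported schedule (Adam, initial learning rate 3 × 10⁻³, multiplied by 0.98 after every epoch) can be sketched in plain Python. This is a minimal illustration of the decay rule only, not the paper's released code; the function name `lr_at_epoch` is our own. In PyTorch the same schedule would typically be expressed as `torch.optim.Adam(params, lr=3e-3)` combined with `torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.98)`.

```python
# Sketch of the exponential learning-rate decay described in the paper's
# experiment setup: lr starts at 3e-3 and is multiplied by 0.98 per epoch.
INITIAL_LR = 3e-3
DECAY = 0.98

def lr_at_epoch(epoch: int) -> float:
    """Learning rate in effect during the given (0-indexed) epoch."""
    return INITIAL_LR * DECAY ** epoch

print(f"{lr_at_epoch(0):.4g}")    # 0.003
# After 100 epochs the rate has decayed to roughly 13% of its start value.
print(f"{lr_at_epoch(100):.4g}")
```

Such a schedule decays smoothly rather than in steps, so the rate after N epochs is simply `3e-3 * 0.98**N`.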