A Sequential Set Generation Method for Predicting Set-Valued Outputs

Authors: Tian Gao, Jie Chen, Vijil Chenthamarakshan, Michael Witbrock | pp. 2835-2842

AAAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We perform experiments with both benchmark and synthetic data sets and demonstrate SSG's strong performance over baseline methods.
Researcher Affiliation | Industry | Tian Gao, Jie Chen, Vijil Chenthamarakshan, Michael Witbrock; IBM Research, Thomas J. Watson Research Center, Yorktown Heights, NY; {tgao, chenjie, ecvijil, witbroc}@us.ibm.com
Pseudocode | Yes | Algorithm 1: SSG Algorithm Testing Procedure; Algorithm 2: SSG Algorithm Training Procedure; Algorithm 3: SSG-S Algorithm Training Procedure; Algorithm 4: SSG-S Algorithm Testing Procedure
Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository for the described methodology.
Open Datasets | Yes | The YEAST and SCENE datasets, both of which are publicly available.
Dataset Splits | No | We do a train-test split of 70/30... We generate 1000 samples and randomly split 70% as training and the rest as testing. The paper mentions train-test splits but does not specify a separate validation split.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU, GPU models, memory) used for running the experiments. It only mentions the use of an encoder-decoder architecture with LSTMs.
Software Dependencies | No | The paper mentions the 'Adam optimizer' but does not specify its version or any other software dependencies with version numbers. It mentions a 'one layer LSTM', which is a model component, not a software dependency with a version.
Experiment Setup | Yes | We use a one layer LSTM with 60 encoder hidden units and 120 decoder hidden units. An embedding layer of size 60 is used for appropriate discrete inputs and outputs. We use Adam optimizer (Kingma and Ba 2014) with a batch size of 15, and cross entropy as loss function.
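The pseudocode row above names the SSG testing procedure (Algorithm 1) without reproducing it. As a rough illustration only, here is a minimal, hypothetical sketch of the general shape of a sequential set generation decode loop: emit one element at a time until a stop symbol and collect the emissions into a set. The `STOP` token, the `next_element` stand-in, and all names here are our assumptions; the paper's actual algorithm drives this loop with an LSTM decoder and differs in detail.

```python
# Hypothetical sketch of a sequential set generation testing loop.
# A real implementation would replace `next_element` with an LSTM
# decoder step conditioned on the encoder state and prior emissions.

STOP = -1  # assumed sentinel marking the end of the generated set


def ssg_decode(next_element, max_len=10):
    """Greedily emit elements until STOP, returning the predicted set."""
    predicted = set()
    for _ in range(max_len):
        elem = next_element(predicted)
        if elem == STOP:
            break
        predicted.add(elem)  # duplicates collapse into the set
    return predicted


# Toy stand-in decoder that emits 3, 1, 4, then STOP.
def toy_decoder(emitted, _script=iter([3, 1, 4, STOP])):
    return next(_script)


print(sorted(ssg_decode(toy_decoder)))  # → [1, 3, 4]
```

The `max_len` cap guards against a decoder that never emits the stop symbol.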
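The dataset-splits row above quotes a 70/30 random train-test split over 1000 generated samples. A minimal sketch of such a split is below; the seed and shuffling scheme are our assumptions, not taken from the paper.

```python
import random


def train_test_split(samples, train_frac=0.7, seed=0):
    """Randomly partition `samples` into train and test lists."""
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)  # deterministic shuffle for the sketch
    cut = round(len(samples) * train_frac)
    train = [samples[i] for i in idx[:cut]]
    test = [samples[i] for i in idx[cut:]]
    return train, test


data = list(range(1000))  # stand-in for the 1000 synthetic samples
train, test = train_test_split(data)
print(len(train), len(test))  # → 700 300
```

Note that, as the row observes, no separate validation split is carved out here either.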
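To make the experiment-setup row concrete, the quoted hyperparameters can be collected into a single config, and the standard single-layer LSTM parameter-count formula applied to the stated sizes. The key names in `config`, the assumption that the decoder also consumes 60-dim embeddings, and the two-bias-vectors-per-gate convention (as in common framework implementations) are ours, not the paper's.

```python
# Hypothetical config collecting the hyperparameters quoted above.
config = {
    "encoder_hidden": 60,
    "decoder_hidden": 120,
    "embedding_dim": 60,
    "lstm_layers": 1,
    "optimizer": "adam",
    "batch_size": 15,
    "loss": "cross_entropy",
}


def lstm_param_count(input_size, hidden_size):
    """Single-layer LSTM parameter count: 4 gates, each with input and
    recurrent weight matrices plus two bias vectors."""
    return 4 * hidden_size * (input_size + hidden_size + 2)


# Encoder: 60-dim embeddings into 60 hidden units.
print(lstm_param_count(config["embedding_dim"], config["encoder_hidden"]))   # → 29280
# Decoder: assumed 60-dim embeddings into 120 hidden units.
print(lstm_param_count(config["embedding_dim"], config["decoder_hidden"]))  # → 87360
```

Counts like these are a quick sanity check when re-implementing a setup from a paper's text alone.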