On Learning Sets of Symmetric Elements
Authors: Haggai Maron, Or Litany, Gal Chechik, Ethan Fetaya
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we investigate the effectiveness of DSS layers in practice, by comparing them to previously suggested architectures and different aggregation schemes. |
| Researcher Affiliation | Collaboration | NVIDIA Research; Stanford University; Bar-Ilan University. |
| Pseudocode | No | No structured pseudocode or algorithm blocks were found. |
| Open Source Code | No | The paper does not provide an explicit statement about the release of its source code or a link to a code repository for the described methodology. |
| Open Datasets | Yes | For videos, we used the UCF101 dataset (Soomro et al., 2012). To generate the data, we cropped 7-frame-long sequences from the Dynamic Faust dataset (Bogo et al., 2017) in which the shapes are given as triangular meshes. We generate data for this task from the Places dataset (Zhou et al., 2017), by adding noise and Gaussian blur to each image. This experiment was conducted on two datasets: CelebA (Liu et al., 2018), and Places (Zhou et al., 2017). Burst deblurring (ImageNet). |
| Dataset Splits | No | The paper does not explicitly provide specific train/validation/test dataset split percentages or sample counts. It mentions using '5 random initializations' but gives no details on data partitioning. |
| Hardware Specification | Yes | Experiments were conducted using NVIDIA DGX with V100 GPUs. |
| Software Dependencies | No | The paper mentions 'Pytorch' in the references, but it does not specify version numbers for any software dependencies used in the experiments. |
| Experiment Setup | No | The paper describes general architectural components and comparative methods, but it does not specify concrete hyperparameters (e.g., learning rate, batch size, number of epochs) or detailed training configurations for reproducibility. |
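The evidence above refers to DSS layers, the paper's building block for learning sets of symmetric elements. As a rough illustration only, the sketch below shows what a linear DSS layer might look like for sets of images (translation symmetry): a shared convolution applied to each element plus a convolution applied to a sum-aggregation over the set. The class and argument names (`DSSConv2d`, `conv_elem`, `conv_agg`) are placeholders chosen here, not the authors' released code.

```python
import torch
import torch.nn as nn

class DSSConv2d(nn.Module):
    """Minimal sketch of a linear DSS layer for image sets (illustrative, not the authors' code).

    Each element gets a shared equivariant map (a convolution), and an
    equivariant map of the sum-aggregated set is added to every element.
    """
    def __init__(self, in_channels, out_channels, kernel_size=3, padding=1):
        super().__init__()
        self.conv_elem = nn.Conv2d(in_channels, out_channels, kernel_size, padding=padding)
        self.conv_agg = nn.Conv2d(in_channels, out_channels, kernel_size, padding=padding)

    def forward(self, x):
        # x: (batch, set_size, channels, height, width)
        b, n, c, h, w = x.shape
        per_elem = self.conv_elem(x.reshape(b * n, c, h, w)).reshape(b, n, -1, h, w)
        agg = self.conv_agg(x.sum(dim=1))      # aggregate over the set, then transform
        return per_elem + agg.unsqueeze(1)     # broadcast the aggregate back to each element
```

Usage would follow the usual PyTorch pattern, e.g. `layer = DSSConv2d(3, 16)` applied to a tensor of shape `(batch, set_size, 3, H, W)`; stacking such layers with nonlinearities and a final set-pooling gives a set-level predictor of the kind the paper compares against other aggregation schemes.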