Sequential Disentanglement by Extracting Static Information From A Single Sequence Element
Authors: Nimrod Berman, Ilan Naiman, Idan Arbiv, Gal Fadlon, Omri Azencot
ICML 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate our method on multiple data-modality benchmarks including general time series, video, and audio, and we show beyond state-of-the-art results on generation and prediction tasks in comparison to several strong baselines. |
| Researcher Affiliation | Academia | Department of Computer Science, Ben Gurion University of the Negev, Beer-Sheva, Israel. |
| Pseudocode | No | The paper describes the architecture and process in text and diagrams (Fig. 1) but does not include any explicit pseudocode blocks or algorithms. |
| Open Source Code | Yes | The code is available on GitHub. |
| Open Datasets | Yes | We follow the protocol of previous work and partition the dataset into 9,000 samples for training and 2,664 samples for testing. |
| Dataset Splits | Yes | The data is split into train, validation, and test sets with a 12/4/4-month split ratio (see the split sketch below the table). |
| Hardware Specification | No | The paper does not mention any specific hardware (e.g., GPU models, CPU models, or cloud computing instances with specifications) used for running the experiments. |
| Software Dependencies | No | The paper describes its implementation using components such as LSTMs and MLPs and refers to the Adam optimizer, but it does not name any software dependencies with version numbers (e.g., Python, PyTorch, or TensorFlow versions). |
| Experiment Setup | Yes | A comprehensive summary of these optimal hyper-parameters for each task and dataset is available in Tab. 6, and all training processes were limited to a maximum of 2000 epochs (see the training-loop sketch below the table). |
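The 12/4/4-month ratio in the Dataset Splits row implies a chronological partition. Below is a minimal sketch of such a split, assuming time-stamped records keyed by a `timestamp` field; the field name, data layout, and start date are assumptions, as the paper states only the ratio.

```python
from datetime import datetime

def split_by_months(records, start, train_months=12, val_months=4, test_months=4):
    """Partition time-stamped records into train/val/test by month offset.

    Records falling outside the train+val+test window are dropped.
    The `timestamp` field name is a placeholder, not from the paper.
    """
    def month_offset(ts):
        return (ts.year - start.year) * 12 + (ts.month - start.month)

    train, val, test = [], [], []
    for rec in records:
        m = month_offset(rec["timestamp"])
        if m < train_months:
            train.append(rec)
        elif m < train_months + val_months:
            val.append(rec)
        elif m < train_months + val_months + test_months:
            test.append(rec)
    return train, val, test

# Example usage (hypothetical start date):
# train, val, test = split_by_months(records, start=datetime(2016, 7, 1))
```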
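The Experiment Setup row notes a 2000-epoch cap with per-task hyper-parameters in the paper's Tab. 6, and the Software Dependencies row mentions LSTM/MLP components and the Adam optimizer. The following PyTorch training loop is a minimal sketch under those constraints only; the model, learning rate, and loss function are placeholders, not the authors' configuration.

```python
import torch
from torch import nn, optim

MAX_EPOCHS = 2000  # the only training-budget detail stated in the paper

# Placeholder model: the paper uses LSTM/MLP components, but the sizes
# here are assumptions.
model = nn.LSTM(input_size=64, hidden_size=128, batch_first=True)
optimizer = optim.Adam(model.parameters(), lr=1e-3)  # lr is an assumption

def train(loader, loss_fn):
    """Run standard mini-batch training, capped at MAX_EPOCHS."""
    for epoch in range(MAX_EPOCHS):
        for batch, target in loader:
            optimizer.zero_grad()
            output, _ = model(batch)      # LSTM returns (output, state)
            loss = loss_fn(output, target)
            loss.backward()
            optimizer.step()
```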