First-Order Manifold Data Augmentation for Regression Learning
Authors: Ilya Kaufman, Omri Azencot
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate FOMA on in-distribution generalization and out-of-distribution robustness benchmarks, and we show that it improves the generalization of several neural architectures. |
| Researcher Affiliation | Academia | 1Department of Computer Science, Ben-Gurion University of the Negev, Beer-Sheva, Israel. |
| Pseudocode | Yes | We provide an example PyTorch pseudocode in Fig. 1 (left). (A hedged sketch of this procedure appears below the table.) |
| Open Source Code | Yes | Our code is publicly available at https://github.com/azencot-group/FOMA |
| Open Datasets | Yes | We use the following five datasets to evaluate the performance of in-distribution generalization. Two tabular datasets: Airfoil Self-Noise (Airfoil) (Brooks et al., 2014) and NO2 (Aldrin, 2004). Two time series datasets: Exchange-Rate and Electricity (Lai et al., 2018)... |
| Dataset Splits | Yes | As per reference (Hwang & Whang, 2021), the training, validation, and test sets consist of 1003, 300, and 200 examples, respectively. (Airfoil dataset) |
| Hardware Specification | Yes | The results are obtained with a single RTX3090 GPU. |
| Software Dependencies | No | The paper implies PyTorch through the pseudocode's call to torch.linalg.svd, but it provides no version numbers for PyTorch or any other software dependency. |
| Experiment Setup | Yes | Detailed experimental settings and hyperparameters are provided in App. E. We list the hyperparameters for every dataset in Table 8 and Table 9 for the methods FOMA and FOMAℓ, respectively. |
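
The Pseudocode row above refers to the paper's Fig. 1, which the authors describe as PyTorch pseudocode built around torch.linalg.svd. As a rough illustration of the mechanism the paper describes, here is a minimal, self-contained sketch of SVD-based batch augmentation in that spirit: concatenate a batch of inputs and targets, keep the top-k singular values, and scale the trailing ones by a factor λ sampled from a Beta distribution. The function name foma_batch, the default hyperparameters, and the use of torch.distributions.Beta are illustrative assumptions, not the authors' exact code.

```python
import torch

def foma_batch(X, Y, alpha=1.0, k=2):
    """Sketch of FOMA-style augmentation: jointly factor a batch of
    inputs and targets, then shrink the trailing singular values so
    the augmented batch stays close to the data's dominant subspace."""
    # lambda ~ Beta(alpha, alpha), as in mixup-style schemes
    lam = torch.distributions.Beta(alpha, alpha).sample()
    A = torch.cat([X, Y], dim=1)                        # (B, d + q)
    U, S, Vh = torch.linalg.svd(A, full_matrices=False)
    S = torch.cat([S[:k], lam * S[k:]])                 # keep top-k, scale the rest
    A_aug = U @ torch.diag(S) @ Vh                      # reconstruct augmented batch
    d = X.shape[1]
    return A_aug[:, :d], A_aug[:, d:]                   # split back into X', Y'

# Illustrative usage on random tabular-style data (shapes are made up):
X = torch.randn(32, 5)                                  # batch of inputs
Y = torch.randn(32, 1)                                  # batch of regression targets
X_aug, Y_aug = foma_batch(X, Y, alpha=0.5, k=2)
```

Scaling only the trailing singular values perturbs the batch along its least-significant directions, which is what allows the augmented samples to remain near the data manifold while the dominant structure of the batch is preserved.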