Controlled Differential Equations on Long Sequences via Non-standard Wavelets
Authors: Sourav Pal, Zhanpeng Zeng, Sathya N. Ravi, Vikas Singh
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We present evaluations on a wide variety of experimental settings from prediction to autoencoding to modeling coupled differential equations, to assess the effectiveness and capability of our proposed method. |
| Researcher Affiliation | Academia | ¹University of Wisconsin-Madison, ²University of Illinois Chicago. Correspondence to: Sourav Pal <spal9@wisc.edu>. |
| Pseudocode | Yes | Algorithm 1 BCR-DE and Algorithm 2 Partially Un-shared Convolution (PUC) |
| Open Source Code | Yes | Code is available at https://github.com/sourav-roni/BCR-DE. |
| Open Datasets | Yes | We evaluate a regression task on data from Beth Israel Deaconess Medical Centre (BIDMC), see (Tan et al., 2020). (Section 6.1.1 (a) Dataset and Setup). We use the Eigen Worms dataset from (Bagnall et al., 2017) (Section 6.1.2 (a) Dataset). |
| Dataset Splits | Yes | In all cases for simulated data, we use 2000 training samples and 1000 samples each for validation and test. (Section 6.3 (a)-(b) Dataset and Setup). Size of training set is 5508, while validation and test set have 1181 samples each. (Appendix B.2). |
| Hardware Specification | No | The paper mentions that 'commodity GPUs were sufficient' but does not provide specific details such as GPU model, CPU, or memory used for running the experiments. |
| Software Dependencies | No | The paper mentions software components like 'Adam', 'scipy odeint', 'RK4', and 'DOPRI5', but it does not specify version numbers for these software dependencies. A hedged data-generation sketch using scipy's odeint follows the table. |
| Experiment Setup | Yes | The training is performed with a learning rate of 0.01, using the Adam optimizer and a weight decay of 0.0001. We use a learning rate scheduler to reduce the learning rate when the validation loss plateaus, with a patience of 5 and a factor of 0.5. (Appendix B.1) A hedged sketch of these settings follows the table. |
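
The Dataset Splits and Software Dependencies rows quote a 2000/1000/1000 train/validation/test split for simulated data and the use of scipy's `odeint`. The sketch below is not the authors' code: the toy coupled system, initial-condition range, time grid, and all parameter values are illustrative assumptions used only to show how such data could be generated and split.

```python
# Minimal sketch (illustrative, not the authors' pipeline): generate simulated
# trajectories with scipy.integrate.odeint and split 2000 / 1000 / 1000.
import numpy as np
from scipy.integrate import odeint

def coupled_system(state, t, k=1.0, c=0.1):
    """Toy coupled ODE: a damped harmonic oscillator (assumed for illustration)."""
    x, v = state
    return [v, -k * x - c * v]

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 100)   # time grid per trajectory (assumed)
n_total = 4000                    # 2000 train + 1000 val + 1000 test

# One trajectory per random initial condition.
trajectories = np.stack([
    odeint(coupled_system, rng.uniform(-1.0, 1.0, size=2), t)
    for _ in range(n_total)
])

# Split as quoted in the paper: 2000 / 1000 / 1000.
train, val, test = np.split(trajectories, [2000, 3000])
print(train.shape, val.shape, test.shape)  # (2000, 100, 2) (1000, 100, 2) (1000, 100, 2)
```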
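
The Experiment Setup row (Appendix B.1) specifies Adam with a learning rate of 0.01, weight decay of 0.0001, and a plateau-based scheduler with patience 5 and factor 0.5. The following is a minimal sketch of those settings, assuming PyTorch; the placeholder model and the stand-in validation loss are assumptions, not the authors' BCR-DE implementation.

```python
# Minimal sketch, assuming PyTorch, of the quoted optimizer/scheduler settings.
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # placeholder model, not the actual BCR-DE network

optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=5
)

for epoch in range(100):
    # ... run one training epoch and compute the validation loss here ...
    val_loss = torch.rand(1).item()  # stand-in for the real validation loss
    scheduler.step(val_loss)         # halve the lr when validation loss plateaus
```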