Domain Adaptation for Time Series Under Feature and Label Shifts

Authors: Huan He, Owen Queen, Teddy Koker, Consuelo Cuevas, Theodoros Tsiligkaridis, Marinka Zitnik

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our experiments with 5 datasets and 13 state-of-the-art UDA methods demonstrate that RAINCOAT can improve transfer learning performance by up to 16.33% and can handle both closed-set and universal domain adaptation.
Researcher Affiliation | Academia | (1) Department of Biomedical Informatics, Harvard University; (2) Artificial Intelligence Technology, MIT Lincoln Laboratory.
Pseudocode | Yes | Algorithm 1: Overview of RAINCOAT; Algorithm 2: Simplified illustration of the computation of the Sinkhorn divergence (Sinkhorn, 1964); Algorithm 3: Time-Frequency Feature Encoder and Decoder, Domain Alignment via Sinkhorn Divergence; Algorithm 4: Detailed overview of RAINCOAT. (An illustrative sketch of the Sinkhorn divergence computation follows the table.)
Open Source Code | Yes | RAINCOAT is available at https://github.com/mims-harvard/Raincoat.
Open Datasets | Yes | We consider five benchmark datasets from three distinct problem types: (1) human activity recognition: WISDM (Kwapisz et al., 2011), HAR (Anguita et al., 2013), HHAR (Stisen et al., 2015); (2) mechanical fault detection: Boiler (Shohet et al., 2019); and (3) EEG prediction: Sleep EDF (Goldberger et al., 2000).
Dataset Splits | Yes | We report accuracy and macro-F1 calculated using target test datasets. ... For the extraction of time-space features, we utilized a 1D-convolutional neural network (CNN) as the encoder. ... The implementation of the 1D-CNN architecture was adapted from a recently published benchmark codebase in the literature (Ragab et al., 2022), which has also been employed by others (Ozyurt et al., 2022). ... The hyperparameters of Adam were selected after conducting a grid search on source validation datasets, exploring a range of learning rates from 1e-4 to 1e-1. (A sketch of such a grid search appears below.)
Hardware Specification | Yes | The experiments were conducted on an NVIDIA GeForce RTX 3090 graphics card.
Software Dependencies | No | The implementation was done in PyTorch, based on the code available here. However, specific version numbers for PyTorch or other software dependencies are not provided.
Experiment Setup | Yes | Key hyperparameters for RAINCOAT are reported in Tables 5, 6, 7, 8, and 9. The Fourier frequency modes used for the HAR, EEG, HHAR, WISDM, and Boiler datasets are 64, 200, 64, 64, and 10, respectively. For the regularization term used in the Sinkhorn divergence, we consistently used a value of 1e-3 across all datasets and experiments. (An illustrative configuration sketch is given below.)
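
The pseudocode row lists Algorithm 2 as a simplified computation of the Sinkhorn divergence, and the experiment-setup row reports an entropic regularization of 1e-3. The following is a minimal PyTorch sketch of an entropy-regularized Sinkhorn divergence between two feature batches, assuming uniform marginals over each batch; it is an illustration only, not the authors' implementation, and the function names (`sinkhorn_cost`, `sinkhorn_divergence`) are assumptions.

```python
import torch

def cost_matrix(x, y):
    # Squared Euclidean cost between two batches of feature vectors.
    return torch.cdist(x, y, p=2) ** 2

def sinkhorn_cost(x, y, eps=1e-3, n_iters=100):
    # Entropy-regularized OT cost via log-domain Sinkhorn iterations with
    # uniform marginals. With very small eps, more iterations may be needed.
    C = cost_matrix(x, y)
    n, m = C.shape
    log_a = torch.full((n,), 1.0 / n, device=x.device).log()
    log_b = torch.full((m,), 1.0 / m, device=x.device).log()
    f = torch.zeros(n, device=x.device)
    g = torch.zeros(m, device=x.device)
    for _ in range(n_iters):
        # Block-coordinate updates of the dual potentials.
        f = -eps * torch.logsumexp((g[None, :] + eps * log_b[None, :] - C) / eps, dim=1)
        g = -eps * torch.logsumexp((f[:, None] + eps * log_a[:, None] - C) / eps, dim=0)
    # Transport plan implied by the potentials, then the linear OT cost.
    log_P = (f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :]
    return torch.sum(torch.exp(log_P) * C)

def sinkhorn_divergence(x, y, eps=1e-3, n_iters=100):
    # Debiased divergence: OT(x, y) - 0.5 * OT(x, x) - 0.5 * OT(y, y).
    return (sinkhorn_cost(x, y, eps, n_iters)
            - 0.5 * sinkhorn_cost(x, x, eps, n_iters)
            - 0.5 * sinkhorn_cost(y, y, eps, n_iters))

# Example: align 32-sample source and target feature batches of dimension 128.
src = torch.randn(32, 128)
tgt = torch.randn(32, 128) + 0.5
loss = sinkhorn_divergence(src, tgt)
```

In this form the divergence can be used directly as an alignment loss between source and target encoder outputs, with eps=1e-3 matching the regularization value reported above.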
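
The experiment-setup row also gives the number of Fourier frequency modes per dataset (64 for HAR, 200 for EEG, 64 for HHAR, 64 for WISDM, 10 for Boiler). Assuming "frequency modes" means keeping the lowest-frequency rFFT coefficients along the time axis, a hypothetical configuration and feature-extraction sketch could look as follows; the `CONFIG` dictionary and `frequency_features` helper are illustrative names, not the released code.

```python
import torch

# Hypothetical per-dataset configuration mirroring the reported hyperparameters:
# number of Fourier frequency modes and the Sinkhorn regularization term.
CONFIG = {
    "HAR":    {"freq_modes": 64,  "sinkhorn_eps": 1e-3},
    "EEG":    {"freq_modes": 200, "sinkhorn_eps": 1e-3},
    "HHAR":   {"freq_modes": 64,  "sinkhorn_eps": 1e-3},
    "WISDM":  {"freq_modes": 64,  "sinkhorn_eps": 1e-3},
    "Boiler": {"freq_modes": 10,  "sinkhorn_eps": 1e-3},
}

def frequency_features(x, n_modes):
    # x: (batch, channels, time). Keep the n_modes lowest-frequency rFFT
    # coefficients and expose amplitude and phase as real-valued features.
    spec = torch.fft.rfft(x, dim=-1)[..., :n_modes]
    return torch.cat([spec.abs(), spec.angle()], dim=-1)

# Example: 32 HAR-like samples with 9 channels and 128 time steps.
x = torch.randn(32, 9, 128)
feats = frequency_features(x, CONFIG["HAR"]["freq_modes"])
print(feats.shape)  # torch.Size([32, 9, 128])
```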
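
The dataset-splits row notes that Adam hyperparameters were selected via a grid search over learning rates from 1e-4 to 1e-1 on source validation datasets. A minimal sketch of such a selection loop is shown below; `build_model`, `train_one`, and `evaluate` are hypothetical placeholders rather than functions from the RAINCOAT repository, and the grid shown is one plausible discretization of the reported range.

```python
import torch

def select_learning_rate(build_model, train_one, evaluate, train_loader, val_loader,
                         grid=(1e-4, 1e-3, 1e-2, 1e-1)):
    # Pick the Adam learning rate that maximizes source-validation accuracy.
    best_lr, best_acc = None, -float("inf")
    for lr in grid:
        model = build_model()
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        train_one(model, optimizer, train_loader)   # placeholder training routine
        acc = evaluate(model, val_loader)           # placeholder validation metric
        if acc > best_acc:
            best_lr, best_acc = lr, acc
    return best_lr
```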