CTW: Confident Time-Warping for Time-Series Label-Noise Learning
Authors: Peitian Ma, Zhen Liu, Junhao Zheng, Linghao Wang, Qianli Ma
IJCAI 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experimental results show that CTW achieves state-of-the-art performance on the UCR datasets when dealing with different types of noise. Besides, the t-SNE visualization of our method verifies that augmenting confident data improves the generalization ability. |
| Researcher Affiliation | Academia | Peitian Ma¹, Zhen Liu¹, Junhao Zheng¹, Linghao Wang¹ and Qianli Ma¹,² (¹School of Computer Science and Engineering, South China University of Technology, Guangzhou, China; ²Key Laboratory of Big Data and Intelligent Robot (South China University of Technology), Ministry of Education) |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at https://github.com/qianlima-lab/CTW. |
| Open Datasets | Yes | We evaluate our model on publicly available time-series classification datasets from the UCR and UEA repositories [Dau et al., 2019; Bagnall et al., 2018]. |
| Dataset Splits | Yes | We merge the original training and test sets for all time series datasets, then perform five-fold cross-validation, training on four folds and testing on the remaining fold. (A minimal code sketch of this split protocol appears below the table.) |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions the use of the Adam optimizer but does not provide version numbers for any software dependencies or libraries. |
| Experiment Setup | Yes | We use the Adam optimizer [Kingma and Ba, 2014] with an initial learning rate of 0.001. ... In our model, unless otherwise specified, the corresponding hyperparameters default to: λ = 1, µ = 1, γ = 0.3 and β = 10. For all experiments, the max epoch is set to 300. (A sketch of this configuration appears below the table.) |
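
As a concrete reading of the Dataset Splits row, the sketch below pools a UCR dataset's original train and test partitions and iterates over five cross-validation folds. It is an editorial illustration, not the authors' released pipeline: the `sktime` loader, the example dataset name (`GunPoint`), and the shuffle seed are assumptions, and the paper does not state whether the folds are stratified.

```python
# Minimal sketch of the quoted split protocol, not the authors' pipeline.
# Assumptions: sktime's UCR/UEA loader, GunPoint as an example dataset,
# un-stratified folds, and a fixed shuffle seed of 0.
import numpy as np
from sklearn.model_selection import KFold
from sktime.datasets import load_UCR_UEA_dataset

# With split=None (the default), the loader returns the original train and
# test partitions in a single pool, matching "merge the original training
# and test sets".
X, y = load_UCR_UEA_dataset(name="GunPoint", return_X_y=True)
y = np.asarray(y)

kfold = KFold(n_splits=5, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(kfold.split(X)):
    # Train on four folds, test on the remaining fold.
    X_train, y_train = X.iloc[train_idx], y[train_idx]
    X_test, y_test = X.iloc[test_idx], y[test_idx]
    print(f"fold {fold}: {len(train_idx)} train / {len(test_idx)} test series")
```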
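
For the Experiment Setup row, the following sketch wires the quoted optimizer settings and hyperparameter defaults into a PyTorch configuration. PyTorch is assumed from the released repository; the backbone is a generic placeholder rather than the CTW architecture, and the roles of λ, µ, γ and β are defined in the paper and the released code.

```python
# Sketch of the reported training configuration (PyTorch assumed).
# Hyperparameter values are quoted from the paper; the tiny 1-D conv
# backbone is a placeholder, not the CTW architecture.
import torch
import torch.nn as nn

CONFIG = {
    "lr": 1e-3,        # Adam initial learning rate
    "max_epoch": 300,  # max epoch for all experiments
    "lambda": 1.0,     # λ
    "mu": 1.0,         # µ
    "gamma": 0.3,      # γ
    "beta": 10.0,      # β
}

num_classes = 2  # dataset-dependent placeholder
model = nn.Sequential(
    nn.Conv1d(1, 64, kernel_size=8, padding="same"),
    nn.BatchNorm1d(64),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(64, num_classes),
)
optimizer = torch.optim.Adam(model.parameters(), lr=CONFIG["lr"])

for epoch in range(CONFIG["max_epoch"]):
    pass  # one training pass per epoch; CTW's warping and losses go here
```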