CLOCS: Contrastive Learning of Cardiac Signals Across Space, Time, and Patients
Authors: Dani Kiyasseh, Tingting Zhu, David A Clifton
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct our experiments using PyTorch (Paszke et al., 2019) on four ECG datasets that include cardiac arrhythmia labels. |
| Researcher Affiliation | Academia | Department of Engineering Science, University of Oxford, Oxford, United Kingdom; Oxford-Suzhou Centre for Advanced Research, Suzhou, China. |
| Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | Yes | Code can be accessed at: https://github.com/danikiyasseh/CLOCS |
| Open Datasets | Yes | We conduct our experiments using PyTorch (Paszke et al., 2019) on four ECG datasets that include cardiac arrhythmia labels. PhysioNet 2020 (Perez Alday et al., 2020) consists of 12-lead ECG recordings from 6,877 patients alongside 9 different classes of cardiac arrhythmia. Chapman (Zheng et al., 2020) consists of 12-lead ECG recordings from 10,646 patients alongside 11 different classes of cardiac arrhythmia. PhysioNet 2017 (Clifford et al., 2017) consists of 8,528 single-lead ECG recordings alongside 4 different classes. Cardiology (Hannun et al., 2019) consists of single-lead ECG recordings from 328 patients alongside 12 different classes of cardiac arrhythmia. |
| Dataset Splits | Yes | All datasets were split into training, validation, and test sets according to patient ID using a 60/20/20 configuration (a patient-wise split sketch follows the table). |
| Hardware Specification | No | The paper does not specify the hardware used (e.g., GPU/CPU models, memory) for running experiments. |
| Software Dependencies | No | The paper mentions "PyTorch (Paszke et al., 2019)" but does not specify a version number for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | During self-supervised pre-training, we chose the temperature parameter, τ = 0.1, as per (Chen et al., 2020). For BYOL, we chose the decay rate, τd = 0.90, after experimenting with various alternatives (see Appendix F). For all experiments, we use a neural architecture composed of three 1D convolutional layers followed by two fully connected layers. (Sketches of the encoder and the temperature-scaled contrastive loss follow the table.) |
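
The patient-wise 60/20/20 split described above can be implemented along the following lines. This is a minimal sketch, assuming the records are held in a pandas DataFrame with a `patient_id` column; the function and column names are hypothetical and not taken from the CLOCS repository.

```python
# Minimal sketch of a patient-wise 60/20/20 split. Assumes a DataFrame of ECG
# records with a `patient_id` column; names are hypothetical, not from CLOCS.
import numpy as np
import pandas as pd

def split_by_patient(records: pd.DataFrame, seed: int = 0):
    """Assign every record belonging to a patient to exactly one split."""
    rng = np.random.default_rng(seed)
    patients = records["patient_id"].unique()
    rng.shuffle(patients)

    n = len(patients)
    train_ids = set(patients[: int(0.6 * n)])
    val_ids = set(patients[int(0.6 * n) : int(0.8 * n)])
    test_ids = set(patients[int(0.8 * n) :])

    train = records[records["patient_id"].isin(train_ids)]
    val = records[records["patient_id"].isin(val_ids)]
    test = records[records["patient_id"].isin(test_ids)]
    return train, val, test
```

Splitting by patient ID rather than by recording ensures that no patient's data leaks across the training, validation, and test sets.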
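The encoder described in the experiment setup (three 1D convolutional layers followed by two fully connected layers) might look roughly as follows in PyTorch. Channel counts, kernel sizes, strides, the embedding dimension, and the input length are assumptions for illustration, not values reported in the paper or verified against the released code.

```python
# Sketch of a three-conv / two-FC 1D encoder for single-lead ECG segments.
# All layer hyperparameters below are assumptions, not the paper's values.
import torch
import torch.nn as nn

class ECGEncoder(nn.Module):
    def __init__(self, embedding_dim: int = 128, input_len: int = 2500):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=7, stride=2), nn.ReLU(),
        )
        # Infer the flattened feature size from a dummy forward pass.
        with torch.no_grad():
            flat = self.conv(torch.zeros(1, 1, input_len)).numel()
        self.fc = nn.Sequential(
            nn.Linear(flat, 256), nn.ReLU(),
            nn.Linear(256, embedding_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, time) single-lead ECG segment
        h = self.conv(x)
        return self.fc(h.flatten(start_dim=1))
```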
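Likewise, a temperature-scaled contrastive loss with τ = 0.1, in the spirit of the NT-Xent objective the paper cites (Chen et al., 2020), can be sketched as below. The generic two-view pairing shown here is an assumption and does not reproduce the exact CLOCS positive-pair construction across space, time, and patients.

```python
# Sketch of a temperature-scaled contrastive loss (tau = 0.1). The two-view
# pairing is generic; it is not the paper's exact CLOCS formulation.
import torch
import torch.nn.functional as F

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1):
    """z1, z2: (batch, dim) embeddings of two views of the same segments."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau  # cosine similarities scaled by temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    # Each row's positive is its own index; all other entries act as negatives.
    return F.cross_entropy(logits, targets)
```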