Electrocardio Panorama: Synthesizing New ECG Views with Self-supervision

Authors: Jintai Chen, Xiangshang Zheng, Hongyun Yu, Danny Z. Chen, Jian Wu

IJCAI 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments verify that our Nef-Net performs well on Electrocardio Panorama synthesis, and outperforms the previous work on the auxiliary tasks (ECG view transformation and ECG synthesis from scratch).
Researcher Affiliation | Academia | Jintai Chen¹, Xiangshang Zheng¹, Hongyun Yu¹, Danny Z. Chen², Jian Wu³. ¹College of Computer Science and Technology, Zhejiang University, Hangzhou, China; ²Department of Computer Science and Engineering, University of Notre Dame, Notre Dame, USA; ³The First Affiliated Hospital, and Department of Public Health, Zhejiang University School of Medicine, Hangzhou, China. Emails: jtchen721@gmail.com, {xszheng,yuhongyun777,wujian2000}@zju.edu.cn, dchen@nd.edu
Pseudocode | No | No explicit pseudocode or algorithm blocks were found in the paper.
Open Source Code | Yes | The code and the division labels of cardiac cycles and ECG deflections on the Tianchi ECG and PTB datasets are available at https://github.com/WhatAShot/Electrocardio-Panorama.
Open Datasets | Yes | We conduct experiments using the MIT-BIH dataset [Moody et al., 2001], the PTB dataset [Bousseljot et al., 1995], and the Tianchi ECG dataset (https://tianchi.aliyun.com/competition/entrance/231754/information?lang=en-us).
Dataset Splits | No | The paper describes training/test partitions (PTB and Tianchi are 'randomly partitioned into a training set and a test set with probabilities 0.8 and 0.2, respectively', and specific sample counts are given for MIT-BIH), but it does not describe a validation set or its split. A minimal sketch of such an 80/20 split is given after the table.
Hardware Specification | Yes | We report the means and standard deviations over 3 runs with an RTX 2080 Ti GPU for all the experiments.
Software Dependencies | Yes | We use PyTorch 1.7.1 to implement Nef-Net.
Experiment Setup | Yes | In training, the batch size is 32. Nef-Net is trained for 150 epochs. The learning rate is initialized to 0.1 and is reduced by a factor of 10 at the 50th and 100th epochs. We use SGD as the optimizer with momentum 0.9. (A training-configuration sketch follows the table.)
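For illustration, here is a minimal PyTorch sketch of the 80/20 random train/test partition reported for the PTB and Tianchi datasets. The dummy tensor dataset and the fixed seed are assumptions made to keep the example runnable; the paper states only the split probabilities.

```python
import torch
from torch.utils.data import TensorDataset, random_split

# Dummy stand-in for a PTB/Tianchi ECG dataset (1000 recordings, 12 leads);
# the real datasets are loaded via the authors' pipeline.
dataset = TensorDataset(torch.randn(1000, 12, 500))

# Random partition with probabilities 0.8 / 0.2, as described in the paper.
n_train = int(0.8 * len(dataset))
train_set, test_set = random_split(
    dataset,
    [n_train, len(dataset) - n_train],
    generator=torch.Generator().manual_seed(0),  # assumed seed; the paper does not state one
)
```

And a minimal sketch of the reported training configuration: batch size 32, 150 epochs, SGD with momentum 0.9, and a learning rate of 0.1 divided by 10 at epochs 50 and 100. The linear stand-in model and squared-error loss are placeholders; the actual Nef-Net implementation is in the authors' repository.

```python
import torch
import torch.nn as nn
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model and data; the real Nef-Net and ECG pipeline live at
# https://github.com/WhatAShot/Electrocardio-Panorama.
model = nn.Linear(512, 512)
loader = DataLoader(TensorDataset(torch.randn(256, 512)), batch_size=32, shuffle=True)

optimizer = SGD(model.parameters(), lr=0.1, momentum=0.9)
# Divide the learning rate by 10 at the 50th and 100th of 150 epochs.
scheduler = MultiStepLR(optimizer, milestones=[50, 100], gamma=0.1)

for epoch in range(150):
    for (x,) in loader:
        optimizer.zero_grad()
        loss = (model(x) - x).pow(2).mean()  # placeholder reconstruction-style loss
        loss.backward()
        optimizer.step()
    scheduler.step()  # step once per epoch so milestones are epoch indices
```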