Conditional Information Bottleneck Approach for Time Series Imputation

Authors: MinGyu Choi, Changhee Lee

ICLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our experiments, conducted on multiple real-world datasets, consistently demonstrate that our method significantly improves imputation performance (including both interpolation and extrapolation), and also enhances prediction performance based on the imputed values.
Researcher Affiliation | Academia | MinGyu Choi, Massachusetts Institute of Technology, USA (chemgyu@mit.edu); Changhee Lee, Chung-Ang University, Korea (changheelee@cau.ac.kr)
Pseudocode | Yes | We provide the pseudo-algorithm of TimeCIB as below: Algorithm 1: Conditional Information Bottleneck on Time Series. (An illustrative training-step sketch follows this table.)
Open Source Code | Yes | CODE AVAILABILITY: The codebase used in this paper is available at https://github.com/Chemgyu/TimeCIB.
Open Datasets | Yes | Healing MNIST (Krishnan et al., 2015) has approximately 60% of missing pixels under a missing-not-at-random (MNAR) pattern on every time step... Rotated MNIST (Ramchandran et al., 2021) evaluates performance on interpolation and extrapolation... Beijing Air Quality (Zhang et al., 2017) and US Local (https://www.ncei.noaa.gov/data/local-climatological-data/), whose time series measurements are collected every hour... Physionet2012 Mortality Prediction Challenge (Silva et al., 2012)
Dataset Splits | Yes | Table D1: Data Statistics. Healing MNIST: 50000/10000/10000 samples (train/val/test), Len (T) 10, Feature Dim 28×28×1, 10 classes, Missing Ratio (ori/art) -/60%. Rotated MNIST: 50000/10000/10000 samples (train/val/test), Len (T) 10, Feature Dim 28×28×1, 10 classes, Missing Ratio (ori/art) -/60%. (An illustrative split/masking sketch follows this table.)
Hardware Specification | Yes | All experiments were conducted using a 48GB NVIDIA RTX A6000.
Software Dependencies | No | The paper does not mention specific software dependencies with version numbers, such as Python, PyTorch, or TensorFlow versions, only general implementation details.
Experiment Setup | Yes | Table C1: Hyperparameter specifications (Hidden Dim, Batch Size, Epochs, Learning Rate, Temperature, Kernel parameter): Healing MNIST 128, 64, 30, 1e-3, 1.0, 2.0; Rotated MNIST 128, 64, 30, 1e-3, 1.0, 2.0; Beijing 128, 64, 100, 1e-3, 1.0, 4.0; US Local 64, 16, 20, 1e-4, 1.0, 2.0; Physionet2012 16, 256, 50, 1e-3, 1.0, 32. (These values are transcribed into a config sketch after this table.)
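
For the pseudocode row above, only the title of Algorithm 1 is quoted. As a loose, non-authoritative illustration of what a conditional-information-bottleneck-style imputation step can look like, the sketch below (assuming PyTorch-style modules) combines masked reconstruction with a KL regularizer toward a conditional prior. The module names (encoder, decoder, cond_prior), the Gaussian parameterization, and the beta weight are assumptions made here; the authors' actual Algorithm 1 is given in the paper and the released TimeCIB codebase.

```python
# Loose illustration only, NOT the authors' Algorithm 1. The modules
# `encoder`, `decoder`, and `cond_prior` are assumed, user-supplied networks.
import torch
import torch.nn.functional as F

def train_step(encoder, decoder, cond_prior, optimizer, x, mask, timestamps, beta=1.0):
    """One training step for sequence imputation.
    x:          (B, T, D) sequence with zeros at unobserved entries
    mask:       (B, T, D) 1 = observed, 0 = missing
    timestamps: (B, T)    observation times used as conditioning information
    """
    optimizer.zero_grad()

    # Per-timestep Gaussian posterior q(z_t | x, mask).
    mu_q, logvar_q = encoder(x, mask)                           # (B, T, Z) each
    z = mu_q + torch.randn_like(mu_q) * (0.5 * logvar_q).exp()  # reparameterize

    # Reconstruct the full sequence; penalize errors on observed entries only.
    x_hat = decoder(z)                                          # (B, T, D)
    recon = (F.mse_loss(x_hat, x, reduction="none") * mask).sum() / mask.sum()

    # Regularize q toward a conditional prior p(z_t | t), e.g. a smooth
    # temporal prior over the latent trajectory (an assumption of this sketch).
    mu_p, logvar_p = cond_prior(timestamps)                     # (B, T, Z) each
    kl = 0.5 * (
        logvar_p - logvar_q
        + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
        - 1.0
    ).mean()

    loss = recon + beta * kl
    loss.backward()
    optimizer.step()
    return loss.item()
```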
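
For the dataset-splits row, the 50000/10000/10000 split and roughly 60% artificial missingness reported in Table D1 can be mimicked as follows. The mask here is drawn uniformly at random purely for illustration; it is not the paper's missing-not-at-random corruption of Healing MNIST.

```python
# Illustrative only: mirrors the 50000/10000/10000 split and ~60% missing
# ratio from Table D1, but uses a uniformly random mask rather than the
# paper's missing-not-at-random (MNAR) pattern.
import numpy as np

def split_and_mask(images, missing_ratio=0.6, seed=0):
    """images: (70000, T, 28, 28) array of image sequences.
    Returns [(train_data, train_mask), (val_data, val_mask), (test_data, test_mask)]."""
    rng = np.random.default_rng(seed)
    mask = (rng.random(images.shape) >= missing_ratio).astype(np.float32)
    data = images.astype(np.float32) * mask   # zero out the "missing" pixels
    splits = [slice(0, 50000), slice(50000, 60000), slice(60000, 70000)]
    return [(data[s], mask[s]) for s in splits]
```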
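
The Table C1 hyperparameters in the experiment-setup row transcribe directly into a per-dataset configuration. The dictionary layout and key names below are chosen here for convenience; the paper does not prescribe a configuration format.

```python
# Per-dataset hyperparameters transcribed from Table C1 of the paper.
# Key names and the dictionary layout are illustrative choices.
HPARAMS = {
    "healing_mnist": dict(hidden_dim=128, batch_size=64,  epochs=30,  lr=1e-3, temperature=1.0, kernel_param=2.0),
    "rotated_mnist": dict(hidden_dim=128, batch_size=64,  epochs=30,  lr=1e-3, temperature=1.0, kernel_param=2.0),
    "beijing_air":   dict(hidden_dim=128, batch_size=64,  epochs=100, lr=1e-3, temperature=1.0, kernel_param=4.0),
    "us_local":      dict(hidden_dim=64,  batch_size=16,  epochs=20,  lr=1e-4, temperature=1.0, kernel_param=2.0),
    "physionet2012": dict(hidden_dim=16,  batch_size=256, epochs=50,  lr=1e-3, temperature=1.0, kernel_param=32),
}
```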