Cost-Effective Interactive Attention Learning with Neural Attention Processes
Authors: Jay Heo, Junhyeon Park, Hyewon Jeong, Kwang Joon Kim, Juho Lee, Eunho Yang, Sung Ju Hwang
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We validate IAL on various time-series datasets from multiple domains (healthcare, real-estate, and computer vision) on which it significantly outperforms baselines with conventional attention mechanisms, or without cost-effective reranking, with substantially less retraining and human-model interaction cost. |
| Researcher Affiliation | Collaboration | Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea; Yonsei University College of Medicine, Seoul, South Korea; AITRICS, Seoul, South Korea |
| Pseudocode | Yes | Algorithm 1 Interactive Attention Learning Framework |
| Open Source Code | Yes | The source codes and all datasets used for our experiments are publicly available at https://github.com/jayheo/IAL. |
| Open Datasets | Yes | The source codes and all datasets used for our experiments are publicly available at https://github.com/jayheo/IAL. |
| Dataset Splits | Yes | For all datasets, we generate train/valid/test splits with the ratio of 70%:10%:20%. (A split sketch follows this table.) |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU models, CPU types, or memory specifications) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | No | The paper states 'Please see supplementary file for more details of the datasets, network configurations, and hyperparameters.' The main text does not include specific hyperparameter values or detailed training configurations. |
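
The 70%:10%:20% train/valid/test ratio quoted above is the only split detail given in the main text. Below is a minimal sketch of how such a split could be reproduced, assuming a generic index-based dataset; the function name, random seed, and NumPy usage are illustrative choices, not taken from the released code.

```python
import numpy as np

def split_indices(n_samples: int, seed: int = 0):
    """Shuffle instance indices and split them 70% / 10% / 20%
    (train / valid / test), matching the ratio reported in the paper."""
    rng = np.random.default_rng(seed)   # seed is an illustrative choice, not from the paper
    idx = rng.permutation(n_samples)
    n_train = int(0.7 * n_samples)
    n_valid = int(0.1 * n_samples)
    train_idx = idx[:n_train]
    valid_idx = idx[n_train:n_train + n_valid]
    test_idx = idx[n_train + n_valid:]  # remaining ~20%
    return train_idx, valid_idx, test_idx

# Example on a hypothetical dataset of 1,000 instances.
train_idx, valid_idx, test_idx = split_indices(1000)
print(len(train_idx), len(valid_idx), len(test_idx))  # 700 100 200
```

Whether the authors split by random permutation or by a fixed (e.g., temporal) order is not stated in the main text; the supplementary material and the linked repository would be the place to confirm.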