Input-Cell Attention Reduces Vanishing Saliency of Recurrent Neural Networks
Authors: Aya Abdelsalam Ismail, Mohamed Gunady, Luiz Pessoa, Hector Corrada Bravo, Soheil Feizi
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Using synthetic data, we show that the saliency map produced by the input-cell attention RNN is able to faithfully detect important features regardless of their occurrence in time. We also apply the input-cell attention RNN on a neuroscience task analyzing functional Magnetic Resonance Imaging (fMRI) data for human subjects performing a variety of tasks. |
| Researcher Affiliation | Academia | 1 Department of Computer Science, University of Maryland 2 Department of Psychology, University of Maryland {asalam,mgunady}@cs.umd.edu, pessoa@umd.edu, hcorrada@umiacs.umd.edu, sfeizi@cs.umd.edu |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code available at https://github.com/ayaabdelsalam91/Input-Cell-Attention |
| Open Datasets | Yes | We apply input-cell attention to an openly available fMRI dataset of the Human Connectome Project (HCP) [26]. |
| Dataset Splits | No | The paper mentions training and testing but does not specify validation splits or split percentages for any dataset. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | No | The paper describes the datasets used and the types of experiments conducted (e.g., binary classification, training to convergence) but does not report specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or detailed system-level training settings. |