A Walsh Hadamard Derived Linear Vector Symbolic Architecture
Authors: Mohammad Mahmudul Alam, Alexander Oberle, Edward Raff, Stella Biderman, Tim Oates, James Holt
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Section 4 will empirically evaluate HLB in classical VSA benchmark tasks, and in two recent deep learning tasks, showing improved performance in each scenario. |
| Researcher Affiliation | Collaboration | ¹University of Maryland, Baltimore County, ²Booz Allen Hamilton, ³Laboratory for Physical Sciences |
| Pseudocode | No | No pseudocode or algorithm blocks were explicitly labeled or presented in a structured format. |
| Open Source Code | Yes | Code is available at https://github.com/FutureComputing4AI/Hadamard-derived-Linear-Binding. |
| Open Datasets | Yes | CSPS experimented with 5 datasets: MNIST, SVHN, CIFAR-10 (CR10), CIFAR-100 (CR100), and Mini-ImageNet (MIN). ... The network is trained on the 8 datasets listed in Table 4 from [4] |
| Dataset Splits | Yes | Other than changing the VSA used, we follow the same training, testing, architecture size, and validation procedure of [3]. |
| Hardware Specification | Yes | All the experiments are performed on a single NVIDIA TESLA PH402 GPU with 32GB memory. |
| Software Dependencies | No | The paper states that 'The Torch HD library [15] is used for implementations of prior methods' but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | Other than changing the VSA used, we follow the same training, testing, architecture size, and validation procedure of [3]. |
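The "classical VSA benchmark tasks" noted in the Research Type row refer to the standard bind-bundle-unbind retrieval setting for vector symbolic architectures. As a point of reference, the following is a minimal, hypothetical sketch of that task in plain PyTorch using generic element-wise (MAP/Hadamard-style) binding with bipolar hypervectors. It is not the paper's HLB operator or its released code; the dimension, number of pairs, and variable names are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's HLB implementation): a classical VSA
# key-value retrieval task using element-wise (MAP/Hadamard-style) binding
# with bipolar {-1, +1} hypervectors. Dimension and pair count are assumptions.
import torch

torch.manual_seed(0)
d, num_pairs = 1024, 8  # hypervector dimension, number of key-value pairs

# Random bipolar key and value hypervectors
keys = torch.randint(0, 2, (num_pairs, d)).float() * 2 - 1
values = torch.randint(0, 2, (num_pairs, d)).float() * 2 - 1

# Bind each key to its value element-wise, then bundle (sum) into one memory vector
memory = (keys * values).sum(dim=0)

# Unbind with a query key (element-wise binding is self-inverse for bipolar
# vectors) and recover the stored value by nearest cosine similarity
query = keys[3]
noisy_value = memory * query
sims = torch.nn.functional.cosine_similarity(noisy_value.unsqueeze(0), values)
print("retrieved index:", sims.argmax().item())  # expected: 3
```

The retrieved value is only approximately equal to the stored one because the other bound pairs contribute noise; with a sufficiently large dimension the correct value is still the nearest neighbor with high probability, which is what these benchmarks measure across binding schemes such as HLB and the prior VSAs implemented in Torch HD.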