Learning Memory Access Patterns
Authors: Milad Hashemi, Kevin Swersky, Jamie Smith, Grant Ayers, Heiner Litz, Jichuan Chang, Christos Kozyrakis, Parthasarathy Ranganathan
ICML 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | On a suite of challenging benchmark datasets, we find that neural networks consistently demonstrate superior performance in terms of precision and recall. This work represents the first step towards practical neural-network based prefetching, and opens a wide range of exciting directions for machine learning in computer architecture research. |
| Researcher Affiliation | Collaboration | ¹Google, ²Stanford University, ³University of California, Santa Cruz. |
| Pseudocode | No | The paper describes the LSTM process using mathematical equations but does not include any structured pseudocode or algorithm blocks for the overall prefetching methodology (a hedged sketch of the described model appears after this table). |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or provide a link to a code repository. |
| Open Datasets | Yes | To evaluate our proposals, we use the memory intensive applications of SPEC CPU2006. This is a standard benchmark suite that is used pervasively to evaluate the performance of computer systems. However, SPEC CPU2006 also has small working sets when compared to modern datacenter workloads. Therefore in addition to SPEC benchmarks, we also include Google's websearch workload. |
| Dataset Splits | No | The paper states, 'We split each trace into a training and testing set, using 70% for training and 30% for evaluation,' but does not explicitly mention a separate validation set. |
| Hardware Specification | No | The paper mentions 'a simple cache simulator that emulates an Intel Broadwell microprocessor' for data collection, but does not specify the hardware used to train or run the LSTM models for the experiments. |
| Software Dependencies | No | The paper mentions tools and optimizers like 'Pin', 'ADAM', and 'Adagrad', but does not provide specific version numbers for these or other software dependencies. |
| Experiment Setup | Yes | The paper documents its experimental configuration, stating: 'We report the specific hyperparameters used in the appendix.' |
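Because the paper provides equations but no pseudocode, the following is a minimal PyTorch sketch of the kind of embedding-LSTM delta-prediction model it describes, together with the chronological 70%/30% train/test split quoted above. This is not the authors' implementation: the vocabulary size, embedding and hidden dimensions, and the toy trace are illustrative stand-ins (the paper quantizes address deltas to a large vocabulary of frequent values); only the split ratio follows the quoted text.

```python
import torch
import torch.nn as nn

class DeltaLSTM(nn.Module):
    """Embedding -> LSTM -> softmax over a quantized delta vocabulary."""
    def __init__(self, vocab_size=5000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, delta_ids):          # delta_ids: (batch, seq_len) int64
        x = self.embed(delta_ids)          # (batch, seq_len, embed_dim)
        out, _ = self.lstm(x)              # (batch, seq_len, hidden_dim)
        return self.head(out)              # logits over the delta vocabulary

# Chronological 70%/30% train/test split of a delta trace, mirroring the
# split quoted above (the paper mentions no separate validation set).
trace = torch.randint(0, 5000, (1, 2000))  # stand-in for a real access trace
cut = int(trace.size(1) * 0.7)
train, test = trace[:, :cut], trace[:, cut:]

model = DeltaLSTM()
logits = model(train[:, :-1])              # predict the next delta at each step
loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), train[:, 1:].reshape(-1))
```

The sketch reflects only the paper's high-level modeling move of framing prefetching as classification over a delta vocabulary rather than regression over raw addresses; all concrete hyperparameters here are assumptions, not the values the paper reports in its appendix.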