Heterogeneous Region Embedding with Prompt Learning
Authors: Silin Zhou, Dan He, Lisi Chen, Shuo Shang, Peng Han
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiment results on real-world datasets demonstrate that our proposed model outperforms state-of-the-art methods. |
| Researcher Affiliation | Academia | 1 University of Electronic Science and Technology of China 2 The University of Queensland, Australia |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The details of implementation can be viewed on this URL: https://github.com/slzhou-xy/HREP |
| Open Datasets | Yes | We collect a variety of real-world data from NYC Open Data (https://opendata.cityofnewyork.us) specific for the Manhattan, New York area, where taxi trips are used as human mobility. |
| Dataset Splits | No | The paper describes the datasets used (Table 1) but does not provide specific details on train/validation/test splits, percentages, or absolute sample counts for data partitioning. |
| Hardware Specification | Yes | Our model is implemented with PyTorch on an Nvidia RTX3090 GPU. |
| Software Dependencies | No | The paper mentions 'PyTorch' but does not specify its version or provide version numbers for any other software dependencies. |
| Experiment Setup | Yes | The dimension of our model is 144. The dimension of prompt embedding is also set as 144. The layer of relation-aware GCN is set as 3. In the multi-head self-attention, we set the number of heads as 4. We adopt Adam to optimize our model, including HRE module and prompt learning module, and both learning rates are set as 0.001. The epoch is set as 2000 in HRE module and 6000 in prompt learning. Moreover, we set the value of k as 10. |
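The hyperparameters reported in the Experiment Setup row can be collected into a minimal configuration sketch. The class and field names below are illustrative assumptions, not taken from the authors' repository:

```python
from dataclasses import dataclass


# Hyperparameters as reported in the paper; the HREPConfig name and its
# field names are hypothetical, chosen only for this sketch.
@dataclass(frozen=True)
class HREPConfig:
    embed_dim: int = 144       # model (region embedding) dimension
    prompt_dim: int = 144      # prompt embedding dimension
    gcn_layers: int = 3        # relation-aware GCN layers
    num_heads: int = 4         # multi-head self-attention heads
    lr: float = 1e-3           # Adam learning rate for both modules
    hre_epochs: int = 2000     # training epochs for the HRE module
    prompt_epochs: int = 6000  # training epochs for prompt learning
    top_k: int = 10            # the value of k


cfg = HREPConfig()
# The model dimension must split evenly across attention heads
# (144 / 4 = 36 per head).
assert cfg.embed_dim % cfg.num_heads == 0
```

A frozen dataclass keeps the reported values in one auditable place, which is convenient when re-running a reproduction with PyTorch's Adam optimizer at the stated learning rate.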