KGTS: Contrastive Trajectory Similarity Learning over Prompt Knowledge Graph Embedding
Authors: Zhen Chen, Dalin Zhang, Shanshan Feng, Kaixuan Chen, Lisi Chen, Peng Han, Shuo Shang
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on two real-world trajectory datasets demonstrate the superior performance of KGTS over state-of-the-art methods. |
| Researcher Affiliation | Collaboration | University of Electronic Science and Technology of China; Aalborg University, Denmark; Centre for Frontier AI Research, A*STAR, Singapore; Institute of High-Performance Computing, A*STAR, Singapore |
| Pseudocode | No | The paper includes a framework diagram (Figure 1) and mathematical equations, but no explicit pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code or links to a code repository for the described methodology. |
| Open Datasets | Yes | We adopt two popular large benchmark datasets for trajectory analysis in our experiments, namely GeoLife (Zheng, Xie, and Ma 2010) and Porto (Moreira-Matias et al. 2016). |
| Dataset Splits | Yes | For both datasets, we randomly chose trajectories to keep the training, validation, and test ratio approximately as 1:1:1. |
| Hardware Specification | Yes | All experiments are conducted with a GeForce RTX 3090 GPU. |
| Software Dependencies | No | The paper mentions 'Adam optimizer' but does not provide specific version numbers for software dependencies or libraries. |
| Experiment Setup | Yes | The margin γ in Eq. 6 is set to 12. We then train the trajectory embedding module using unsupervised contrastive learning with the loss function in Eq. 12. The hyperparameter τ in Eq. 12 is set to 0.05. Both phases are trained with the Adam optimizer and a learning rate of 0.0001. |
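The paper's Eq. 6 (margin-based loss) and Eq. 12 (temperature-scaled contrastive loss) are not reproduced in this report, so the sketch below is only a generic illustration of how the two reported hyperparameters typically enter such losses: a margin ranking term with γ = 12 and an InfoNCE-style softmax over cosine similarities with τ = 0.05. The function names and the exact loss forms are assumptions, not KGTS's actual equations.

```python
import numpy as np

GAMMA = 12.0  # margin gamma in Eq. 6 (value reported in the paper)
TAU = 0.05    # temperature tau in Eq. 12 (value reported in the paper)

def margin_loss(pos_dist, neg_dist, gamma=GAMMA):
    """Generic margin ranking loss (illustrative, not Eq. 6 itself):
    penalize unless the negative is at least `gamma` farther than the positive."""
    return max(0.0, gamma + pos_dist - neg_dist)

def info_nce(anchor, positive, negatives, tau=TAU):
    """Temperature-scaled contrastive loss over cosine similarities
    (InfoNCE-style; illustrative, not Eq. 12 itself)."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    # Positive pair occupies index 0; negatives follow.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()  # numerical stability before exponentiating
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

rng = np.random.default_rng(0)
anchor = rng.normal(size=8)
# A near-duplicate of the anchor plays the role of the positive view.
loss = info_nce(anchor, anchor + 0.01 * rng.normal(size=8),
                [rng.normal(size=8) for _ in range(4)])
```

With a small temperature such as 0.05, the softmax sharpens strongly, so the loss drops quickly once the positive pair is even slightly more similar than the negatives, which is the usual motivation for such values in contrastive training.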