Open Anomalous Trajectory Recognition via Probabilistic Metric Learning

Authors: Qiang Gao, Xiaohan Wang, Chaoran Liu, Goce Trajcevski, Li Huang, Fan Zhou

IJCAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results on two large-scale trajectory datasets demonstrate the superiority of ATROM in addressing both known and unknown anomalous patterns.
Researcher Affiliation | Academia | (1) Southwestern University of Finance and Economics, Chengdu, China, 611130; (2) University of Electronic Science and Technology of China, Chengdu, China, 610054; (3) Iowa State University, Iowa, USA
Pseudocode | No | The paper describes its methodology and components but does not include a clearly labeled 'Pseudocode' or 'Algorithm' block.
Open Source Code | Yes | For reproducibility, the source codes are available at https://github.com/ypeggy/ATROM.
Open Datasets | Yes | We conduct our experiments on two real-world taxi trajectory datasets. The first taxi dataset [Liu et al., 2020] is collected from 442 taxis operating in the city of Porto during Jan 07 2013 to Jun 30 2014. ... The second dataset is collected from DiDi Chuxing (http://outreach.didichuxing.com/research/opendata/), containing a large number of taxi traces generated from the city of Chengdu in Aug 2014.
Dataset Splits | No | For the training and testing setups, we use 90% of the trajectories as the training set and the rest as the testing set. The paper does not explicitly mention a separate validation split.
Hardware Specification | Yes | We implemented ATROM and all the baselines in Python using the PyTorch library, accelerated by the NVIDIA Tesla A100.
Software Dependencies | No | The paper mentions 'Python using the PyTorch library' but does not specify version numbers for either, which reproducibility of software dependencies requires.
Experiment Setup | Yes | In the implementation, we set d to 128, J is 10, the size of hidden state is 256, and the learning rate is initialized as 0.001. We use Adam as the optimization algorithm.
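The Dataset Splits row notes a 90/10 train/test partition with no separate validation set. A minimal sketch of that protocol (the shuffling and fixed seed are assumptions; the paper does not state how trajectories were assigned to each split):

```python
import random

def train_test_split(trajectories, train_frac=0.9, seed=42):
    """Split trajectories into train/test sets, mirroring the paper's
    90/10 setup. Seed and shuffling are assumptions, not from the paper."""
    idx = list(range(len(trajectories)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * train_frac)
    train = [trajectories[i] for i in idx[:cut]]
    test = [trajectories[i] for i in idx[cut:]]
    return train, test

# Toy usage: 100 placeholder trajectories -> 90 train, 10 test.
train, test = train_test_split(list(range(100)))
```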
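The Experiment Setup row reports Adam with an initial learning rate of 0.001. As a worked illustration, here is a pure-Python sketch of the Adam update rule at that learning rate; the moment coefficients and epsilon (b1 = 0.9, b2 = 0.999, eps = 1e-8) are the standard defaults and an assumption, since the paper does not list them:

```python
import math

def adam_step(params, grads, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update (Kingma & Ba) over parallel lists of floats.
    lr = 0.001 matches the paper; b1/b2/eps are assumed defaults."""
    for i, g in enumerate(grads):
        m[i] = b1 * m[i] + (1 - b1) * g          # first-moment estimate
        v[i] = b2 * v[i] + (1 - b2) * g * g      # second-moment estimate
        m_hat = m[i] / (1 - b1 ** t)             # bias correction
        v_hat = v[i] / (1 - b2 ** t)
        params[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return params

# Toy usage: minimize f(x) = x^2 starting from x = 1.0.
x, m, v = [1.0], [0.0], [0.0]
for t in range(1, 2001):
    x = adam_step(x, [2 * x[0]], m, v, t)
```

In a PyTorch implementation such as the authors', this corresponds to constructing `torch.optim.Adam(model.parameters(), lr=0.001)`.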