Online Multi-Target Tracking Using Recurrent Neural Networks
Authors: Anton Milan, S. Hamid Rezatofighi, Anthony Dick, Ian Reid, Konrad Schindler
AAAI 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on both synthetic and real data show promising results obtained at 300 Hz on a standard CPU, and pave the way towards future research in this direction. |
| Researcher Affiliation | Academia | ¹School of Computer Science, The University of Adelaide; ²Photogrammetry and Remote Sensing, ETH Zürich. ¹{firstname.lastname}@adelaide.edu.au, ²schindler@geod.baug.ethz.ch |
| Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | Yes | Both our entire code base as well as pre-trained models are publicly available.² https://bitbucket.org/amilan/rnntracking |
| Open Datasets | Yes | We further test our approach on real-world data, using the MOTChallenge 2015 benchmark (Leal-Taixé et al. 2015). |
| Dataset Splits | No | The paper mentions “11/11 for training and testing” for MOTChallenge, but does not specify a separate validation split or how validation was handled for the 100K generated sequences. |
| Hardware Specification | No | obtained at 300 Hz on a standard CPU |
| Software Dependencies | Yes | We implemented our framework in Lua and Torch7. |
| Experiment Setup | Yes | The RNN for state estimation and track management is trained with one layer and 300 hidden units. The data association is a more complex task, requiring more representation power. To that end, the LSTM module employed to learn the data association consists of two layers and 500 hidden units. We use RMSprop (Tieleman and Hinton 2012) to minimise the loss. The learning rate is set initially to 0.0003 and is decreased by 5% every 20 000 iterations. We set the maximum number of iterations to 200 000. |
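
For orientation, the quoted training setup can be mirrored in a few lines of code. The sketch below is a hedged illustration in PyTorch rather than the authors' released Lua/Torch7 codebase: the module sizes and the RMSprop schedule follow the quoted hyper-parameters, while `INPUT_DIM` and the variable names are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of the quoted setup (not the authors' Torch7 implementation).
# Network sizes and the optimiser schedule mirror the quote above; INPUT_DIM is
# an illustrative assumption.
import torch
import torch.nn as nn

INPUT_DIM = 4  # assumption: e.g. one bounding box (x, y, w, h)

# State estimation / track management RNN: one layer, 300 hidden units.
state_rnn = nn.RNN(input_size=INPUT_DIM, hidden_size=300, num_layers=1)

# Data association LSTM: two layers, 500 hidden units.
assoc_lstm = nn.LSTM(input_size=INPUT_DIM, hidden_size=500, num_layers=2)

params = list(state_rnn.parameters()) + list(assoc_lstm.parameters())

# RMSprop, initial learning rate 0.0003, decreased by 5% every 20 000 iterations.
optimizer = torch.optim.RMSprop(params, lr=3e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=20_000, gamma=0.95)

# Training loop (loss computation omitted), capped at 200 000 iterations:
# for iteration in range(200_000):
#     optimizer.zero_grad(); loss.backward()
#     optimizer.step(); scheduler.step()
```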