Sensor-Based Activity Recognition via Learning From Distributions
Authors: Hangwei Qian, Sinno Pan, Chunyan Miao
AAAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct experiments on four benchmark datasets to verify the effectiveness and scalability of our proposed method. |
| Researcher Affiliation | Academia | Hangwei Qian, Sinno Jialin Pan, Chunyan Miao; Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly, Interdisciplinary Graduate School, Nanyang Technological University, Singapore; School of Computer Science and Engineering, Nanyang Technological University, Singapore; qian0045@e.ntu.edu.sg, {sinnopan, ascymiao}@ntu.edu.sg |
| Pseudocode | No | The paper describes the methodology in text and equations, but does not include any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | No | The paper does not provide any links or explicit statements about releasing their source code. |
| Open Datasets | Yes | Four benchmark datasets are used in our experiments. The overall statistics of the datasets are listed in Table 1. Skoda (Stiefmeier, Roggen, and Tröster 2007)... WISDM (Kwapisz, Weiss, and Moore 2010)... HCI (Förster, Roggen, and Tröster 2009)... PS (Shoaib, Scholten, and Havinga 2013)... |
| Dataset Splits | Yes | In our experiments, each dataset is randomly split into training and testing sets using a ratio of 70% : 30%... We tune the kernel parameter γ as well as the tradeoff parameter C in LibSVM, and choose optimal parameter settings based on 5-fold cross-validation on the training set. |
| Hardware Specification | Yes | The experiments are conducted on a Linux computer with Intel(R) Core(TM) i7-4790S 3.20GHz CPU. |
| Software Dependencies | No | The paper mentions LIBSVM but does not provide a version number, and no other software dependencies with version numbers are listed. |
| Experiment Setup | Yes | We tune the kernel parameter γ as well as the tradeoff parameter C in LibSVM, and choose optimal parameter settings based on 5-fold cross-validation on the training set. PCA is conducted as preprocessing with 90% variance kept. (See the sketch after this table.) |
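
The split, tuning, and preprocessing steps quoted in the table can be approximated with a short scikit-learn sketch. This is an illustrative reconstruction rather than the authors' code: the feature matrix `X`, labels `y`, and the grid of candidate `gamma`/`C` values are placeholder assumptions, and scikit-learn's `SVC` (which wraps LIBSVM) stands in for the LibSVM tool named in the paper.

```python
# Hedged sketch of the protocol described in the table:
# 70% : 30% random split, PCA keeping 90% of the variance, and an
# RBF-kernel SVM with gamma and C tuned by 5-fold cross-validation
# on the training set. X and y are placeholders, not the paper's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 50))   # placeholder feature matrix
y = rng.integers(0, 10, size=1000)    # placeholder activity labels

# 70% : 30% random train/test split, as stated in the paper
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=0)

# PCA retaining 90% of the variance, followed by an RBF-kernel SVM
pipeline = Pipeline([
    ("pca", PCA(n_components=0.90)),
    ("svm", SVC(kernel="rbf")),
])

# Tune gamma and C via 5-fold cross-validation on the training set;
# the candidate values below are assumptions, not taken from the paper.
param_grid = {
    "svm__gamma": [1e-3, 1e-2, 1e-1, 1],
    "svm__C": [0.1, 1, 10, 100],
}
search = GridSearchCV(pipeline, param_grid, cv=5)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", search.score(X_test, y_test))
```

Wrapping PCA and the SVM in one `Pipeline` refits the 90%-variance projection on each cross-validation training fold; the paper does not state whether PCA was fitted before or after the split, so this ordering is an assumption of the sketch.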