Sparse Gaussian Conditional Random Fields on Top of Recurrent Neural Networks
Authors: Xishun Wang, Minjie Zhang, Fenghui Ren
AAAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | CoR is evaluated on both synthetic and real-world data, and it shows a significant performance improvement over state-of-the-art methods. |
| Researcher Affiliation | Academia | Xishun Wang, Minjie Zhang, Fenghui Ren; School of Computing and Information Technology, University of Wollongong, NSW 2522, Australia. xw357@uowmail.edu.au; {minjie, fren}@uow.edu.au |
| Pseudocode | Yes | Algorithm 1: Alternative training of CoR (a minimal sketch of such an alternating loop is given after this table). |
| Open Source Code | No | The paper mentions using Theano and Lasagne for the implementation and links to the NPower Forecasting Challenge data and results, but it does not state that the source code for the proposed CoR model is publicly available, nor does it provide a link to it. |
| Open Datasets | Yes | We apply CoR to an electricity demand prediction problem, a competition called the NPower Forecasting Challenge 2016. This competition adopted a rolling forecasting mode to simulate the real-world scenario. [...] https://www.npowerjobs.com/graduates/forecastingchallenge. Data are publicly available. |
| Dataset Splits | No | In each evaluation, a random 80% of the samples are used for training, while the rest are used for testing (this split is included in the setup sketch after the table). |
| Hardware Specification | Yes | Evaluations are conducted on a server with 8 CPUs and 64 GB of memory. |
| Software Dependencies | No | In the implementation, we use Theano (Theano Development Team 2016) and Lasagne (Dieleman et al. 2015) for the deep neural networks. |
| Experiment Setup | Yes | The synthetic data are generated as follows. The feature dimension D is fixed at 10, while the number of time steps T is varied over {10, 20, 40} and the number of samples N over {1000, 2000, 4000, 8000} (a sketch of this setup, together with the 80/20 split quoted above, follows the table). |
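
To make the pseudocode row concrete: the paper's Algorithm 1 alternates between updating the RNN feature extractor and the SGCRF output layer. The sketch below is a minimal, hypothetical stand-in for that loop under simplifying assumptions, not the paper's implementation: a tanh feature map plays the RNN's role and a closed-form ridge solve plays the SGCRF's role, so only the alternating structure itself is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data shaped like the paper's synthetic setting:
# N samples, T time steps, feature dimension D = 10.
N, T, D, H = 1000, 20, 10, 8
X = rng.normal(size=(N, T, D))
true_w = rng.normal(size=D)
y = X.mean(axis=1) @ true_w + 0.1 * rng.normal(size=N)

# Hypothetical stand-ins (assumptions, not the paper's components):
# a tanh feature map plays the RNN's role, a ridge solve the SGCRF's.
W = 0.1 * rng.normal(size=(D, H))  # "RNN" parameters
beta = np.zeros(H)                 # "SGCRF" parameters

def features(X, W):
    # Mean-pool over time, then a nonlinear map: a crude proxy
    # for an RNN's sequence representation.
    return np.tanh(X.mean(axis=1) @ W)

for step in range(30):
    # Phase 1: freeze the feature extractor, refit the output layer
    # (closed-form ridge regression stands in for the SGCRF update).
    Phi = features(X, W)
    beta = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(H), Phi.T @ y)

    # Phase 2: freeze the output layer, take one gradient step on the
    # feature extractor (stands in for the RNN update in Algorithm 1).
    M = X.mean(axis=1)
    Phi = np.tanh(M @ W)
    resid = Phi @ beta - y                       # shape (N,)
    dZ = resid[:, None] * (1.0 - Phi**2) * beta  # shape (N, H)
    W -= 0.01 * (M.T @ dZ) / N

print(f"final training MSE: {np.mean((features(X, W) @ beta - y) ** 2):.4f}")
```

The point of the alternating schedule is that each phase optimizes one module while the other is held fixed, which is easier than jointly optimizing both at once; the paper's actual updates for the RNN and the SGCRF should be taken from Algorithm 1 itself.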
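The experiment-setup and dataset-split rows can likewise be made concrete. The paper does not specify its exact synthetic generator, so the snippet below assumes standard-normal features and targets purely as placeholders; only the quoted shapes (D = 10, T in {10, 20, 40}, N in {1000, 2000, 4000, 8000}) and the random 80/20 split come from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

D = 10  # feature dimension, fixed at 10 as in the paper
for T in (10, 20, 40):                  # time steps varied as quoted
    for N in (1000, 2000, 4000, 8000):  # sample counts varied as quoted
        # Placeholder generator: the paper's exact synthetic process is
        # not reproduced here, so standard-normal data stands in.
        X = rng.normal(size=(N, T, D))
        y = rng.normal(size=N)  # assumed scalar target per sample

        # Random 80/20 split, matching the quoted protocol ("random 80%
        # samples are used for training, while the rest are for testing").
        perm = rng.permutation(N)
        n_train = int(0.8 * N)
        X_train, y_train = X[perm[:n_train]], y[perm[:n_train]]
        X_test, y_test = X[perm[n_train:]], y[perm[n_train:]]
        print(f"T={T:2d}, N={N:4d}: {len(y_train)} train / {len(y_test)} test")
```

Because the split is random and no fixed indices or seed are reported, exact reproduction of the paper's train/test partitions is not possible, which is consistent with the "No" verdict in the Dataset Splits row.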