Bayesian online change point detection with Hilbert space approximate Student-t process
Authors: Jeremy Sellier, Petros Dellaportas
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Improvements in prediction and training time are demonstrated with real-world data sets. |
| Researcher Affiliation | Academia | Department of Statistical Science, University College London, UK; Department of Statistics, Athens University of Economics and Business, Greece; The Alan Turing Institute, UK. |
| Pseudocode | Yes | Algorithm 1: BOCPD run length estimation; Algorithm 2: HSSPAR-CP UPM implementation (an illustrative sketch of the run-length recursion is given below the table). |
| Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository for the methodology described. |
| Open Datasets | Yes | The Nile data set records the lowest annual water levels of the Nile river during the period 622-1284. The data has been used for change point detection in Garnett et al. (2009) and Saatçi et al. (2010). The Well Log data set contains 4050 measurements of radioactivity taken during the drilling of a well. These data have been studied in the context of change point detection by Ó Ruanaidh & Fitzgerald (2012) and by Fearnhead & Clifford (2003). |
| Dataset Splits | No | The paper specifies training and test sets for the datasets (e.g., '200 training points, 463 Test points' for Nile Data), but does not explicitly mention a separate validation dataset split. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU, GPU models, cloud instances) used for running the experiments. |
| Software Dependencies | No | The paper mentions 'scipy method linalg.blas.dger for Python' but does not provide specific version numbers for these software components. |
| Experiment Setup | Yes | We use a hazard function with a trainable constant hazard rate h initialized at 100... Our implementations of HSSPAR and HSGPAR use the Hilbert space reduced-rank kernel derived from Gaussian kernels with the number of basis functions m ranging from 5 to 15. For auto-regressive UPM (GPAR and HSSPAR variants), we use lag parameter p = 1, 2, 3. |
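
The Pseudocode row refers to Algorithm 1, BOCPD run length estimation. As a rough illustration of that recursion, the sketch below implements the classic Adams & MacKay style run-length filter with a conjugate Normal-Gamma model, whose one-step predictive is a Student-t. This is a simplification, not the paper's Hilbert space approximate Student-t process UPM, and all parameter names (`hazard_rate`, `mu0`, `kappa0`, `alpha0`, `beta0`) and defaults are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def bocpd_student_t(x, hazard_rate=100.0, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Run-length filtering in the style of BOCPD (Algorithm 1 in the paper).

    A Normal-Gamma conjugate model serves as the UPM, so the predictive of
    each run is a Student-t; the paper instead uses a Hilbert space
    approximate Student-t process for this step.
    """
    T = len(x)
    R = np.zeros((T + 1, T + 1))          # R[t, r] = P(run length r after t points)
    R[0, 0] = 1.0
    # Sufficient statistics, one entry per candidate run length.
    mu = np.array([mu0])
    kappa = np.array([kappa0])
    alpha = np.array([alpha0])
    beta = np.array([beta0])
    # Constant hazard; reading the paper's hazard rate h = 100 as a timescale
    # (per-step change probability 1/h) is an assumption of this sketch.
    h = 1.0 / hazard_rate

    for t, xt in enumerate(x, start=1):
        # Student-t predictive density of x_t under every current run length.
        scale = np.sqrt(beta * (kappa + 1.0) / (alpha * kappa))
        pred = stats.t.pdf(xt, df=2.0 * alpha, loc=mu, scale=scale)

        # Grow each run (no change point) or reset it (change point).
        R[t, 1:t + 1] = R[t - 1, :t] * pred * (1.0 - h)
        R[t, 0] = np.sum(R[t - 1, :t] * pred * h)
        R[t] /= R[t].sum()

        # Conjugate Normal-Gamma posterior updates, prepending the prior
        # for the new run of length zero.
        mu_new = (kappa * mu + xt) / (kappa + 1.0)
        beta_new = beta + kappa * (xt - mu) ** 2 / (2.0 * (kappa + 1.0))
        mu = np.concatenate(([mu0], mu_new))
        kappa = np.concatenate(([kappa0], kappa + 1.0))
        alpha = np.concatenate(([alpha0], alpha + 0.5))
        beta = np.concatenate(([beta0], beta_new))
    return R
```

In this sketch the most probable change points can be read off the argmax of each row of the returned run-length posterior; the paper's Algorithm 2 (HSSPAR-CP UPM) would replace the Normal-Gamma predictive step with the Hilbert space approximate Student-t process predictive.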
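
The Experiment Setup row also mentions the Hilbert space reduced-rank kernel derived from Gaussian kernels with m ranging from 5 to 15 basis functions. Below is a minimal sketch of that reduced-rank construction in the style of Solin & Särkkä, which the Hilbert space approximation builds on; the domain half-width `L`, `lengthscale`, and `variance` arguments are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def hilbert_space_se_kernel(x1, x2, m=10, L=5.0, lengthscale=1.0, variance=1.0):
    """Reduced-rank (Hilbert space) approximation of a 1D Gaussian/SE kernel:
    k(x, x') ~= sum_j S(sqrt(lambda_j)) * phi_j(x) * phi_j(x'),
    using m Laplacian eigenfunctions on the interval [-L, L].
    """
    j = np.arange(1, m + 1)
    sqrt_lam = np.pi * j / (2.0 * L)                 # sqrt of Laplacian eigenvalues
    # Spectral density of the squared-exponential kernel at sqrt(lambda_j).
    S = variance * np.sqrt(2.0 * np.pi) * lengthscale \
        * np.exp(-0.5 * (lengthscale * sqrt_lam) ** 2)

    def phi(x):
        # Dirichlet Laplacian eigenfunctions on [-L, L].
        return np.sin(np.outer(np.atleast_1d(x) + L, sqrt_lam)) / np.sqrt(L)

    return phi(x1) @ np.diag(S) @ phi(x2).T
```

For example, `hilbert_space_se_kernel(np.linspace(-3, 3, 50), np.linspace(-3, 3, 50), m=10)` approximates the 50 × 50 Gaussian kernel matrix through a rank-10 feature map; that low-rank structure is what makes the per-step UPM updates cheaper than with an exact Gaussian or Student-t process.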