Monitoring of a Dynamic System Based on Autoencoders

Authors: Aomar Osmani, Massinissa Hamidi, Salah Bouhouche

IJCAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | 'Experimental results, including hyper-parameter optimization on large real data and domain expert analysis, show that our proposed solution gives promising results.' 'In this section, we evaluate our approach on a real industrial application dataset and demonstrate how it can reliably detect abnormal behavior of such industrial equipment.'
Researcher Affiliation | Collaboration | (1) Laboratoire LIPN, UMR CNRS 7030, PRES Sorbonne Paris Cité, France; (2) Industrial Technologies Research Center, CRTI-DTSI, Algiers, Algeria
Pseudocode | Yes | Algorithm 1: Continuous learning model
Open Source Code | Yes | 'Code to reproduce experiments is publicly available': https://www.github.com/hamidimassinissa/vibration-sae
Open Datasets | No | The paper states, 'Data were collected from a set of 10 sensors that continuously monitor a 102J turbo-compressor operating in a real application.' While it describes the dataset, it does not provide any concrete access information (link, DOI, specific repository, or formal citation for a public dataset) for the dataset used.
Dataset Splits | No | The paper mentions a 'nominal training period ζ' and a 'nominal control period η' in Algorithm 1, and 'nominal training periods ζ ∈ {200, 500, 1000, 1500, 2000}' in the experiments, but it does not specify standard train/validation/test splits as percentages, sample counts, or references to predefined splits.
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU or GPU models, memory, or other machine specifications) used to run the experiments.
Software Dependencies | No | The paper states: 'All of our experiments were implemented using PyTorch framework [Paszke et al., 2017]'. While PyTorch is mentioned, no version number for the framework, nor any other software dependency, is provided.
Experiment Setup | Yes | The paper provides several experimental setup details, including: 'minibatches of size bs', the 'Adam algorithm', 'learning-rate lr and weight-decay d are optimized using the Bayesian optimization procedure', 'gradient clipping at 0.25', and 'dropout is applied'. Table 2 also 'summarizes the hyper-parameters being optimized along with their respective bounds and pairwise marginal importance'.
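The period-based scheme behind the Dataset Splits row can be made concrete with a short sketch. This is a hypothetical reading of the paper's Algorithm 1, not the authors' code: a sample stream is consumed as a nominal training period of ζ samples followed by a nominal control period of η samples, rather than being split by percentages. The function name, its signature, and the value of η below are my own assumptions; only ζ = 2000 comes from the set of training-period lengths the paper reports exploring.

```python
def nominal_periods(stream, zeta, eta):
    """Partition a sample stream into a nominal training period of
    `zeta` samples, a nominal control period of the next `eta`
    samples, and the remainder left for continuous monitoring."""
    training = stream[:zeta]
    control = stream[zeta:zeta + eta]
    monitoring = stream[zeta + eta:]
    return training, control, monitoring

# Example with one of the training-period lengths explored in the paper
# (eta = 500 is an arbitrary illustrative choice).
samples = list(range(3000))
training, control, monitoring = nominal_periods(samples, zeta=2000, eta=500)
print(len(training), len(control), len(monitoring))  # 2000 500 500
```

The point of the sketch is the contrast the row draws: the splits are contiguous periods of the monitored signal, defined by lengths, not by train/validation/test percentages.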
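The setup quoted in the Experiment Setup row (minibatch training, gradient clipping at 0.25, an autoencoder whose reconstruction error on nominal data defines normal behavior) can be sketched without the actual framework. The following is a minimal stand-in, not the paper's implementation: a hand-rolled two-parameter-pair linear autoencoder trained by plain SGD in place of Adam, per-gradient clipping at the quoted 0.25, and an anomaly threshold set to the largest reconstruction error seen on the nominal period. All constants, the synthetic data, and the thresholding rule are illustrative assumptions.

```python
import random

random.seed(0)

# Synthetic "nominal" signal: 2-D points scattered around the line y = x.
nominal = [(t, t + random.gauss(0, 0.05)) for t in [i / 100 for i in range(200)]]

# Linear autoencoder: encode (x, y) -> z = w1*x + w2*y; decode z -> (v1*z, v2*z).
w1, w2, v1, v2 = 0.3, 0.1, 0.2, 0.4  # arbitrary starting parameters
lr, clip = 0.05, 0.25                # clipping value 0.25 as quoted above

def clipg(g):
    """Clip a single gradient to [-clip, +clip] before the update step."""
    return max(-clip, min(clip, g))

def reconstruction_error(p):
    z = w1 * p[0] + w2 * p[1]
    return (p[0] - v1 * z) ** 2 + (p[1] - v2 * z) ** 2

for epoch in range(200):
    for x, y in nominal:
        z = w1 * x + w2 * y
        ex, ey = v1 * z - x, v2 * z - y      # reconstruction residuals
        gv1, gv2 = 2 * ex * z, 2 * ey * z    # d(error)/dv1, d(error)/dv2
        gz = 2 * ex * v1 + 2 * ey * v2
        gw1, gw2 = gz * x, gz * y            # d(error)/dw1, d(error)/dw2
        w1 -= lr * clipg(gw1); w2 -= lr * clipg(gw2)
        v1 -= lr * clipg(gv1); v2 -= lr * clipg(gv2)

# Anomaly threshold: worst reconstruction error over the nominal period.
threshold = max(reconstruction_error(p) for p in nominal)

print(reconstruction_error((0.5, 0.5)) <= threshold)   # point on the nominal line
print(reconstruction_error((0.5, -0.5)) > threshold)   # point far off the line
```

A point consistent with the nominal regime reconstructs well and stays under the threshold, while an off-manifold point produces a large reconstruction error, which is the monitoring principle the row summarizes.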