LREN: Low-Rank Embedded Network for Sample-Free Hyperspectral Anomaly Detection

Authors: Kai Jiang, Weiying Xie, Jie Lei, Tao Jiang, Yunsong Li (pp. 4139-4146)

AAAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate the efficacy of LREN in generalizing to unknown targets and different datasets.
Researcher Affiliation | Academia | Kai Jiang, Weiying Xie, Jie Lei, Tao Jiang, Yunsong Li; State Key Laboratory of Integrated Services Networks, Xidian University, Xi'an 710071, China
Pseudocode | No | The paper does not contain any explicitly labeled 'Pseudocode' or 'Algorithm' blocks, nor does it present structured steps formatted like code.
Open Source Code | Yes | Code available at https://github.com/xdjiangkai/LREN.
Open Datasets | Yes | We evaluate our LREN on four benchmark hyperspectral datasets, including San Diego (Xu et al. 2016), Hydice (Li and Du 2014), Coast (Kang et al. 2017), and Pavia (Kang et al. 2017).
Dataset Splits | No | The paper describes training parameters and hyperparameter tuning but does not explicitly detail a training/validation/test dataset split or cross-validation setup.
Hardware Specification | Yes | We implement our method in TensorFlow on one NVIDIA 2080 Ti GPU with 8 GB memory.
Software Dependencies | No | The paper mentions 'TensorFlow' but does not specify its version number or versions for any other key software libraries.
Experiment Setup | Yes | The number of hidden nodes in the deep latent space is set to 9. We train LREN with SGD in an end-to-end fashion, setting the learning rate to 10^-4 and the batch size to the number of input spatial pixels. We terminate the learning process after 1000 epochs.
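
For illustration only, the sketch below shows how the quoted setup could be wired up in TensorFlow 2 (the paper does not state a TensorFlow version). Only the latent dimension of 9, the SGD optimizer, the 10^-4 learning rate, the full-image batch size (all spatial pixels at once), and the 1000-epoch budget come from the table above; the encoder layers, loss function, and data shapes are hypothetical placeholders, not the LREN objective, which is available in the authors' repository.

```python
import tensorflow as tf

# Values reported in the paper's experiment setup.
LATENT_DIM = 9          # hidden nodes in the deep latent space
LEARNING_RATE = 1e-4    # SGD learning rate
EPOCHS = 1000           # training terminates after 1000 epochs


def build_placeholder_encoder(num_bands: int) -> tf.keras.Model:
    """Stand-in encoder mapping each spectral pixel to a 9-D latent code.

    The layer sizes are illustrative assumptions, not the LREN architecture.
    """
    return tf.keras.Sequential([
        tf.keras.Input(shape=(num_bands,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(LATENT_DIM),
    ])


# Assumed example: a hyperspectral cube flattened to (H*W, bands); the batch
# size equals the number of spatial pixels, i.e. full-batch training.
height, width, num_bands = 100, 100, 189
pixels = tf.random.normal((height * width, num_bands))

model = build_placeholder_encoder(num_bands)
optimizer = tf.keras.optimizers.SGD(learning_rate=LEARNING_RATE)

for epoch in range(EPOCHS):
    with tf.GradientTape() as tape:
        latent = model(pixels, training=True)      # full-batch forward pass
        loss = tf.reduce_mean(tf.square(latent))   # placeholder loss, not LREN's objective
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
```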