Online Credit Payment Fraud Detection via Structure-Aware Hierarchical Recurrent Neural Network
Authors: Wangli Lin, Li Sun, Qiwei Zhong, Can Liu, Jinghua Feng, Xiang Ao, Hao Yang
IJCAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on a large-scale real-world transaction dataset from Alibaba show that our proposed model outperforms state-of-the-art approaches. |
| Researcher Affiliation | Collaboration | 1Alibaba Group, Hangzhou, China 2Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/WangliLin/SAH-RNN. |
| Open Datasets | No | The paper states that 'a large-scale dataset from Alibaba platform' was used; this is an internal, proprietary dataset that is not publicly available, and no public access link or citation is provided. |
| Dataset Splits | Yes | Training: 31,216 / 2,353,543 / 1.31%; Testing: 6,269 / 454,155 / 1.36% (...) We randomly extract 10% samples from the original training set for validation, and perform early stopping if the validation performance is not improved for 10 epochs. (A sketch of this split and early-stopping setup appears after the table.) |
| Hardware Specification | No | The paper does not provide specific hardware details such as CPU/GPU models or memory specifications used for experiments. |
| Software Dependencies | No | The paper mentions 'All the models are implemented with Tensorflow', but it does not specify the version number of TensorFlow or any other software dependencies. |
| Experiment Setup | Yes | We limit the maximal length of the behavior sequence to 500 (...) For all the experiments, we under-sampled the negative examples to lift the ratio of positive samples (fraud transactions) at 10% in the training dataset. (...) we choose Adam [Kingma and Ba, 2015] as optimizer and decide the initial learning rate from {0.01, 0.001, 0.0001} via validation. We set the batch size to 512. |
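
The validation split and early stopping quoted in the Dataset Splits row can be expressed directly in TensorFlow. Below is a minimal sketch, not the authors' code: the data is a random placeholder for the proprietary Alibaba dataset, the feature dimension (16), the AUC monitoring metric, and the epoch budget are illustrative assumptions, and a plain LSTM stands in for the paper's SAH-RNN architecture.

```python
import numpy as np
import tensorflow as tf

# Placeholder data standing in for the proprietary Alibaba dataset; the number
# of samples and the feature dimension (16) are illustrative assumptions.
rng = np.random.default_rng(seed=42)
X = rng.random((2_000, 500, 16), dtype=np.float32)       # sequences capped at length 500
y = rng.binomial(1, 0.1, size=2_000).astype(np.float32)  # ~10% fraud after under-sampling

# Randomly hold out 10% of the training set for validation, as the paper describes.
idx = rng.permutation(len(X))
n_val = int(0.1 * len(X))
X_val, y_val = X[idx[:n_val]], y[idx[:n_val]]
X_tr, y_tr = X[idx[n_val:]], y[idx[n_val:]]

# Early stopping: halt when validation performance has not improved for 10 epochs.
# Monitoring AUC is an assumption; the paper only says 'validation performance'.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_auc", mode="max", patience=10, restore_best_weights=True
)

# Generic LSTM classifier as a stand-in; the SAH-RNN itself is not reproduced here.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(500, 16)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
model.fit(X_tr, y_tr, validation_data=(X_val, y_val),
          epochs=100, batch_size=512, callbacks=[early_stop])
```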
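
Likewise, the Experiment Setup row describes under-sampling negatives until fraud reaches 10% of the training data, and choosing the Adam initial learning rate from {0.01, 0.001, 0.0001} via validation with batch size 512. The sketch below illustrates that configuration under the same placeholder-data assumptions; `undersample_negatives` and `build_model` are hypothetical helpers, not code from the authors' repository.

```python
import numpy as np
import tensorflow as tf

# Placeholder data with a raw fraud rate near the paper's ~1.3%.
rng = np.random.default_rng(seed=0)
X = rng.random((5_000, 500, 16), dtype=np.float32)
y = rng.binomial(1, 0.013, size=5_000).astype(np.float32)

def undersample_negatives(X, y, target_pos_ratio=0.10, seed=0):
    """Drop negatives at random so positives (fraud) reach ~10% of the data."""
    r = np.random.default_rng(seed)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    # Choose n_neg so that len(pos) / (len(pos) + n_neg) == target_pos_ratio.
    n_neg = int(len(pos) * (1 - target_pos_ratio) / target_pos_ratio)
    keep = r.permutation(
        np.concatenate([pos, r.choice(neg, size=min(n_neg, len(neg)), replace=False)])
    )
    return X[keep], y[keep]

X_bal, y_bal = undersample_negatives(X, y)

def build_model(lr):
    # Generic recurrent stand-in for the SAH-RNN architecture.
    m = tf.keras.Sequential([
        tf.keras.Input(shape=(500, 16)),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    m.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
    return m

# Pick the initial learning rate from {0.01, 0.001, 0.0001} via validation AUC.
best_auc, best_lr = -1.0, None
for lr in (0.01, 0.001, 0.0001):
    hist = build_model(lr).fit(X_bal, y_bal, validation_split=0.1,
                               epochs=3, batch_size=512, verbose=0)
    val_auc = max(hist.history["val_auc"])
    if val_auc > best_auc:
        best_auc, best_lr = val_auc, lr
print(f"selected initial learning rate: {best_lr}")
```

Note that the paper applies under-sampling only to the training set; the test set keeps its natural ~1.36% fraud rate, so evaluation remains faithful to the deployed class distribution.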