Static-Dynamic Interaction Networks for Offline Signature Verification
Authors: Huan Li, Ping Wei, Ping Hu (pp. 1893–1901)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The proposed method was evaluated on four popular datasets of different languages. The extensive experimental results manifest the strength of our model. |
| Researcher Affiliation | Academia | Huan Li, Ping Wei*, Ping Hu, Xi'an Jiaotong University, Xi'an, China. huanli@stu.xjtu.edu.cn, pingwei@xjtu.edu.cn, helenhu@xjtu.edu.cn |
| Pseudocode | No | The paper describes the model architecture and processes textually and with equations, but does not provide pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | We test our SDINet model on four public signature datasets: CEDAR Dataset (Kalera and Xu 2004), BHSig-B Dataset (Pal et al. 2016), BHSig-H (Pal et al. 2016), and GPDS Synthetic Signature Database (Ferrer, Diaz-Cabrera, and Morales 2015a). |
| Dataset Splits | No | The paper specifies train and test splits for each dataset but does not explicitly mention a separate validation set split, e.g., 'Referring to previous approaches, 50 people's signatures are used to train our model and the rest of 5 people's signatures for test'. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | Yes | We construct the proposed model based on TensorFlow 1.8.0. |
| Experiment Setup | Yes | All signature images are preprocessed by removing backgrounds using the OTSU algorithm (Otsu 1979) and non-standard binarization that is the same as (Wei, Li, and Hu 2019). We resize all images to the same size of 155 × 220. The parameters of the batch normalization layer are set as decay = 0.99 and ϵ = 10⁻⁵ respectively. |
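The paper releases no code, so the preprocessing described above (Otsu thresholding to separate the signature from the background, then resizing every image to 155 × 220) can only be sketched. The snippet below is an assumed, numpy-only illustration: `otsu_threshold`, `preprocess`, and the nearest-neighbour resize are this report's own constructions, not the authors' implementation, which additionally applies a non-standard binarization step from (Wei, Li, and Hu 2019) that is not reproduced here.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu (1979) threshold for a 2-D uint8 image:
    the gray level that maximizes between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # P(class 0) for each threshold t
    mu = np.cumsum(prob * np.arange(256))     # cumulative mean up to t
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def preprocess(img, size=(155, 220)):
    """Binarize a grayscale signature image with Otsu's threshold
    (stroke -> 0, background -> 1), then nearest-neighbour resize
    to (height, width) = size, matching the paper's 155 x 220."""
    t = otsu_threshold(img)
    binary = (img > t).astype(np.uint8)
    h, w = binary.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return binary[rows][:, cols]

# Toy example: a dark "stroke" on a light background.
img = np.full((310, 440), 230, dtype=np.uint8)
img[100:200, 100:340] = 30
out = preprocess(img)
print(out.shape)  # (155, 220)
```

A real pipeline would load scanned signatures (e.g. with PIL or OpenCV, whose `cv2.threshold(..., cv2.THRESH_OTSU)` implements the same criterion) and would likely use anti-aliased interpolation rather than the nearest-neighbour indexing used here for self-containment.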