DeepHardMark: Towards Watermarking Neural Network Hardware
Authors: Joseph Clements, Yingjie Lao
AAAI 2022, pp. 4450-4458
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate the performance of the hardware watermarking scheme on popular image classification models using various accelerator designs. Our results demonstrate that the proposed methodology effectively embeds watermarks while preserving the original functionality of the hardware architecture. Specifically, we can successfully embed watermarks into the deep learning hardware and reliably execute a ResNet ImageNet classifier with an accuracy degradation of only 0.009%. |
| Researcher Affiliation | Academia | Joseph Clements, Yingjie Lao Department of Electrical and Computer Engineering, Clemson University Clemson, South Carolina, 29634 jfcleme@g.clemson.edu, ylao@clemson.edu |
| Pseudocode | Yes | Algorithm 1: Block Constrained Perturbations and Algorithm 2: Reducing Intra-block Perturbations are provided. |
| Open Source Code | No | The paper does not provide an explicit statement or link to its own open-source code for the methodology described. |
| Open Datasets | Yes | We conduct these experimental evaluations on multiple image classification models for the CIFAR-10, CIFAR-100, and ImageNet datasets. |
| Dataset Splits | No | The paper mentions using a 'validation dataset' in the definition of the Fid metric, but does not specify the explicit split percentages, counts, or methodology for the training, validation, and test datasets. |
| Hardware Specification | No | The paper describes the target hardware architecture for watermarking (MMU, FPGA/ASIC design) and the software used for simulations (PyTorch), but it does not specify the hardware used to run these simulations or experiments. |
| Software Dependencies | No | The software simulations are developed using the deep learning package PyTorch. A specific version number for PyTorch or any other software dependency is not provided. |
| Experiment Setup | Yes | Algorithm 1 lists hyperparameters: c, Tδ, TB, ϵδ, ϵB, ρ1, ρ2, ρ3. Algorithm 2 lists hyperparameter C. These parameters define the experimental setup for the algorithms. |
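The table notes that Algorithm 1 (Block Constrained Perturbations) searches for a perturbation confined to selected hardware blocks under magnitude budgets such as ϵδ. As a rough illustration of that general idea only (not the paper's implementation, whose ADMM-style details and hyperparameters c, ρ1–ρ3 are not reproduced here), the sketch below runs a projected-gradient loop on a toy linear classifier: the perturbation is masked to one "block" of input coordinates and clipped to a per-element budget. All names (`block_mask`, `eps`, the toy model `W`) are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for a model: logits = W @ x (not the paper's accelerator model).
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 8))   # 3 classes, 8 input features
x = rng.normal(size=8)        # trigger input

target = 2                     # watermark class the trigger should map to
block_mask = np.zeros(8)       # perturbation confined to one "block" ...
block_mask[:3] = 1.0           # ... here, the first three coordinates (illustrative)

delta = np.zeros(8)
lr, eps = 0.1, 1.0             # step size and per-element budget (plays the role of eps_delta)
onehot = np.eye(3)[target]
for _ in range(300):
    logits = W @ (x + delta)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Gradient of the cross-entropy loss w.r.t. the perturbation.
    grad = W.T @ (probs - onehot)
    delta -= lr * grad * block_mask      # update only inside the selected block
    delta = np.clip(delta, -eps, eps)    # project back onto the budget
```

By construction, `delta` is zero outside the chosen block and bounded by `eps` inside it; the loop then nudges the trigger toward the watermark class within those constraints, which mirrors the role of the block and magnitude budgets listed in the setup row above.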