Mitigating Privacy Risk in Membership Inference by Convex-Concave Loss
Authors: Zhenlong Liu, Lei Feng, Huiping Zhuang, Xiaofeng Cao, Hongxin Wei
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we validate the effectiveness of our CCL across a wide range of datasets with diverse models, various attack models, and multiple defense baselines. |
| Researcher Affiliation | Academia | Zhenlong Liu¹, Lei Feng², Huiping Zhuang³, Xiaofeng Cao⁴, Hongxin Wei¹ (¹Southern University of Science and Technology, ²Singapore University of Technology and Design, ³South China University of Technology, ⁴Jilin University). |
| Pseudocode | No | The paper describes the proposed method in prose and mathematical equations but does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at https://github.com/ml-stat-Sustech/Convex-Concave-Loss. |
| Open Datasets | Yes | In our evaluation, we employ five datasets: Texas100 (Texas Department of State Health Services, 2006), Purchase100 (Kaggle, 2014), CIFAR-10, CIFAR-100 (Krizhevsky et al., 2009), and ImageNet (Russakovsky et al., 2015). |
| Dataset Splits | Yes | For standard training methods, we split each dataset into four subsets, with each subset serving alternately as the training or testing set for the target and shadow models. |
| Hardware Specification | No | The paper does not provide specific details regarding the hardware (e.g., GPU models, CPU types, or cloud computing instances) used for running the experiments. |
| Software Dependencies | No | The paper does not specify version numbers for any software dependencies or libraries used in the implementation or experimentation. |
| Experiment Setup | Yes | We train the models using SGD with a momentum of 0.9, a weight decay of 0.0005, and a batch size of 128. We set the initial learning rate as 0.1 and drop it by a factor of 10 at each decay epoch. |
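The quoted experiment setup (SGD, momentum 0.9, weight decay 0.0005, batch size 128, initial learning rate 0.1 dropped by a factor of 10 at each decay epoch) can be sketched as a small helper. Note the decay-epoch milestones below are placeholders, since the excerpt does not quote them.

```python
# Hyperparameters quoted in the paper's experiment setup.
MOMENTUM = 0.9
WEIGHT_DECAY = 5e-4
BATCH_SIZE = 128
INITIAL_LR = 0.1


def learning_rate(epoch, milestones=(60, 80)):
    """Step schedule: start at INITIAL_LR, divide by 10 at each decay epoch.

    The milestone epochs (60, 80) are assumptions for illustration only;
    the paper excerpt above does not specify them.
    """
    lr = INITIAL_LR
    for m in milestones:
        if epoch >= m:
            lr /= 10
    return lr
```

In PyTorch terms, this corresponds to `torch.optim.SGD(params, lr=0.1, momentum=0.9, weight_decay=5e-4)` paired with a `MultiStepLR` scheduler using `gamma=0.1`.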