Prompt Certified Machine Unlearning with Randomized Gradient Smoothing and Quantization

Authors: Zijie Zhang, Yang Zhou, Xin Zhao, Tianshi Che, Lingjuan Lyu

NeurIPS 2022

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Empirical evaluation on real datasets demonstrates the superior performance of our PCMU model against several state-of-the-art machine unlearning methods on image classification." |
| Researcher Affiliation | Collaboration | ¹Auburn University, ²Sony AI |
| Pseudocode | No | The paper describes the proposed algorithms using mathematical formulations and textual descriptions but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | No | "We include the citations and URLs of all datasets used in this work and all codes of third-party baselines in Sections 5 and A.7. Since the datasets used are all public datasets and our methodologies and the hyperparameter settings are explicitly described in Section 3, 4, 5, and A.7, our codes and experiments can be easily reproduced on top of a GPU server." This statement implies reproducibility but does not explicitly state that the source code for the proposed method is provided or linked. |
| Open Datasets | Yes | "We train a convolutional neural network (CNN) on Fashion-MNIST [138, 50, 37] for clothing classification. We train LeNet over CIFAR-10 [66, 43, 44, 121, 50, 37] for image classification. We apply the ResNet-18 architecture on SVHN [95, 49, 7] for street view house number identification." (See the data-loading sketch after the table.) |
| Dataset Splits | Yes | "We train the models from scratch by splitting training data into 90% as training set and 10% as validation set for all cases." (Included in the training sketch after the table.) |
| Hardware Specification | Yes | "All experiments were run on a single NVIDIA A6000 GPU." |
| Software Dependencies | Yes | "Our algorithms are implemented based on PyTorch 1.12.1 and Python 3.9.12." |
| Experiment Setup | Yes | "The learning rate is initialized as 0.01 with the cosine annealing scheduler. We choose Adam optimizer with a batch size of 128. The models are trained for 200 epochs." (See the training sketch after the table.) |
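
The Open Datasets row reports three dataset/architecture pairings: a CNN on Fashion-MNIST, LeNet on CIFAR-10, and ResNet-18 on SVHN. The following is a minimal PyTorch/torchvision sketch of how those pairings could be instantiated; the small CNN and the LeNet layer sizes are illustrative assumptions, since the excerpt names the architectures but not their exact configurations.

```python
# Minimal sketch of the dataset/model pairings reported above.
# Only the dataset names and the ResNet-18 choice come from the table;
# the CNN and LeNet architectures below are assumptions for illustration.
import torch.nn as nn
from torchvision import datasets, models, transforms

to_tensor = transforms.ToTensor()

# Fashion-MNIST (28x28 grayscale) with a small CNN.
fmnist = datasets.FashionMNIST("data", train=True, download=True, transform=to_tensor)
cnn = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(64 * 7 * 7, 10),
)

# CIFAR-10 (32x32 RGB) with a LeNet-style network.
cifar10 = datasets.CIFAR10("data", train=True, download=True, transform=to_tensor)
lenet = nn.Sequential(
    nn.Conv2d(3, 6, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(6, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 5 * 5, 120), nn.ReLU(),
    nn.Linear(120, 84), nn.ReLU(), nn.Linear(84, 10),
)

# SVHN (32x32 RGB) with ResNet-18, as stated in the paper.
svhn = datasets.SVHN("data", split="train", download=True, transform=to_tensor)
resnet18 = models.resnet18(num_classes=10)
```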
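The Dataset Splits and Experiment Setup rows report a 90%/10% train/validation split, the Adam optimizer with an initial learning rate of 0.01, cosine annealing, a batch size of 128, and 200 training epochs. Below is a hedged sketch of a training loop under those reported settings; the cross-entropy loss and the validation pass are assumptions, as they are not stated in the excerpts.

```python
# Sketch of the reported training configuration: 90/10 train/validation
# split, Adam with lr 0.01, cosine annealing, batch size 128, 200 epochs.
# The loss function and validation pass are assumed for illustration.
import torch
from torch.utils.data import DataLoader, random_split

def train(model, full_train_set, epochs=200, batch_size=128, lr=0.01,
          device="cuda" if torch.cuda.is_available() else "cpu"):
    # 90% training / 10% validation split, as reported in the paper.
    n_val = int(0.1 * len(full_train_set))
    train_set, val_set = random_split(full_train_set, [len(full_train_set) - n_val, n_val])
    train_loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
    val_loader = DataLoader(val_set, batch_size=batch_size)

    model = model.to(device)
    criterion = torch.nn.CrossEntropyLoss()  # assumed loss for classification
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)

    for epoch in range(epochs):
        model.train()
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
        scheduler.step()

        # Accuracy on the held-out 10% validation split.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in val_loader:
                x, y = x.to(device), y.to(device)
                correct += (model(x).argmax(dim=1) == y).sum().item()
                total += y.numel()
        print(f"epoch {epoch + 1}: val acc {correct / total:.4f}")
    return model
```

For example, `train(resnet18, svhn)` with the objects from the previous sketch would mirror the reported SVHN configuration, though not necessarily the paper's exact training procedure.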