Parameter-Efficient Masking Networks

Authors: Yue Bai, Huan Wang, Xu Ma, Yitian Zhang, Zhiqiang Tao, Yun Fu

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We validate the potential of PEMN by learning masks on random weights with limited unique values and test its effectiveness as a new compression paradigm across different network architectures. Code is available at https://github.com/yueb17/PEMN. Our experiments empirically validate both aspects of interest.
Researcher Affiliation | Collaboration | Yue Bai1, Huan Wang1,4, Xu Ma1, Yitian Zhang1, Zhiqiang Tao3, Yun Fu1,2,4. 1Department of Electrical and Computer Engineering, Northeastern University; 2Khoury College of Computer Science, Northeastern University; 3School of Information, Rochester Institute of Technology; 4AInnovation Labs, Inc.
Pseudocode | Yes | Algorithm 1: One-Layer; Algorithm 2: Max-Layer Padding; Algorithm 3: Random Vector Padding
Open Source Code | Yes | Code is available at https://github.com/yueb17/PEMN.
Open Datasets | Yes | We use CIFAR10 and CIFAR100 datasets [18] for our experiments.
Dataset Splits | No | The paper uses the CIFAR10 and CIFAR100 datasets but does not explicitly provide training/validation/test splits (percentages or sample counts) in the main text. The checklist indicates this information is in the supplementary material, but it is not present in the main paper.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments in the main text. The checklist indicates this information is in the supplementary material, but it is not present in the main paper.
Software Dependencies | No | The paper does not provide ancillary software details (e.g., library or solver names with version numbers) in the main text.
Experiment Setup | No | The paper does not provide specific experimental setup details (concrete hyperparameter values, training configurations, or system-level settings) in the main text. The checklist indicates this information is in the supplementary material, but it is not present in the main paper.
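The padding strategies named in the Pseudocode row (One-Layer, Max-Layer Padding, Random Vector Padding) share one idea: build every layer's weights from a small pool of fixed random values and learn only binary masks over them. A minimal NumPy sketch of the Random Vector Padding idea follows; the prototype length, initialization, and the threshold used as a stand-in mask are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def random_vector_padding(shape, proto_len=256, seed=0):
    """Fill a weight tensor of the given shape by tiling one fixed
    random prototype vector (a sketch of the Random Vector Padding
    idea; proto_len and the normal init are illustrative assumptions)."""
    rng = np.random.default_rng(seed)
    proto = rng.standard_normal(proto_len)
    n = int(np.prod(shape))
    reps = -(-n // proto_len)              # ceiling division
    return np.tile(proto, reps)[:n].reshape(shape)

# The random weights stay frozen; only a binary mask over them would be
# learned. Here a magnitude threshold stands in for a learned mask.
w = random_vector_padding((64, 32))
mask = (np.abs(w) > 0.5).astype(w.dtype)   # stand-in for a learned mask
effective = w * mask                        # masked subnetwork weights
```

Because every layer reuses the same short prototype vector, only the mask bits (and one small vector) need to be stored, which is the compression angle the report's Research Type row refers to.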