DRAGONN: Distributed Randomized Approximate Gradients of Neural Networks

Authors: Zhuang Wang, Zhaozhuo Xu, Xinyu Wu, Anshumali Shrivastava, T. S. Eugene Ng

ICML 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our extensive evaluation in vision and recommendation shows that, to reach the same level of convergence, DRAGONN achieves up to a 3.52× speedup in total training time over DGC.
Researcher Affiliation | Collaboration | Computer Science Department, Rice University, Houston, TX, USA; ThirdAI Corp, Houston, TX, USA.
Pseudocode | Yes | Algorithm 1 DRAGONN (an illustrative sketch of hash-based gradient selection appears below the table).
Open Source Code | No | The paper does not provide an explicit statement or link indicating that the source code for the described methodology is publicly available.
Open Datasets | Yes | We evaluate DRAGONN and baseline methods on the DDT of ResNet50 over the ImageNet-1K (Deng et al., 2009) dataset. We also evaluate DRAGONN and baseline methods over ViT and MLP-Mixer on fine-tuning tasks: given weights pretrained on ImageNet-21k (Ridnik et al., 2021), we perform DDT on Cifar10 (Krizhevsky et al., 2009). We use the Wiki10-31K dataset from the extreme classification repository (Bhatia et al., 2016).
Dataset Splits | No | The paper does not explicitly describe training, validation, or test splits (e.g., percentages or sample counts per split). Standard datasets are used, but the splitting methodology and validation-set usage needed for reproduction are not detailed.
Hardware Specification | Yes | We perform experiments on 16 Nvidia Tesla V100 32GB GPUs. Each machine has 8 GPUs, 96 CPU cores (Intel Xeon 8260 at 2.40GHz), and 256 GB of RAM.
Software Dependencies | Yes | The machines run the Debian 10 operating system, and the software environment includes CUDA 11.0, PyTorch 1.8.0, NCCL 2.7.8, and Horovod 0.19.1 (a version-check sketch appears below the table).
Experiment Setup | Yes | We use Adam (Kingma & Ba, 2014) as the optimizer with batch size 64 and learning rate 0.1. After linearly warming up the learning rate, we reduce it by a factor of 10 at the 30th, 60th, and 80th epochs (see the schedule sketch below the table).
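The Pseudocode row confirms only that the paper contains Algorithm 1. The sketch below is a rough, non-authoritative illustration of the general idea suggested by the title and abstract: selecting significant gradient coordinates by randomized hashing instead of an exact top-k. The bucket count, hash function, thresholding rule, and all identifiers are assumptions for illustration, not the paper's Algorithm 1.

```python
# Illustrative sketch only -- NOT the paper's Algorithm 1.
# Idea: pick gradient coordinates above a magnitude threshold and scatter them into a
# fixed-size hash table instead of running an exact top-k selection; a collision means
# at most one surviving coordinate per bucket, which keeps selection cost low.
import torch

def hash_sparsify(grad: torch.Tensor, threshold: float, num_buckets: int = 4096):
    """Compress a gradient tensor into (index_table, value_table) of size num_buckets."""
    flat = grad.flatten()
    candidates = torch.nonzero(flat.abs() > threshold, as_tuple=False).flatten()
    # Cheap multiplicative hash of coordinate indices into [0, num_buckets).
    buckets = (candidates * 2654435761) % num_buckets
    # Resolve collisions by keeping a single winning candidate per bucket.
    slot = torch.full((num_buckets,), -1, dtype=torch.long)
    slot[buckets] = torch.arange(candidates.numel())
    kept = slot >= 0
    index_table = torch.full((num_buckets,), -1, dtype=torch.long)
    value_table = torch.zeros(num_buckets, dtype=flat.dtype)
    index_table[kept] = candidates[slot[kept]]
    value_table[kept] = flat[index_table[kept]]
    return index_table, value_table

def densify(index_table: torch.Tensor, value_table: torch.Tensor, numel: int) -> torch.Tensor:
    """Rebuild a dense, approximate gradient from the hashed representation."""
    out = torch.zeros(numel, dtype=value_table.dtype)
    kept = index_table >= 0
    out[index_table[kept]] = value_table[kept]
    return out
```

In a distributed setting, the two fixed-size tables rather than the full dense gradient would be exchanged among workers; how DRAGONN actually performs selection, collision handling, and aggregation is specified by the paper's Algorithm 1, not by this sketch.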
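For the Software Dependencies row, a small runtime check like the following can help confirm that a reproduction machine matches the reported stack (CUDA 11.0, PyTorch 1.8.0, NCCL 2.7.8, Horovod 0.19.1). The exact version-string formats are assumptions and may differ slightly between builds.

```python
# Print the versions relevant to the stack reported in the card:
# Debian 10, CUDA 11.0, PyTorch 1.8.0, NCCL 2.7.8, Horovod 0.19.1.
import torch

print("PyTorch:", torch.__version__)          # expected: 1.8.0
print("CUDA   :", torch.version.cuda)         # expected: 11.0
print("NCCL   :", torch.cuda.nccl.version())  # expected: 2.7.8 (format varies across PyTorch builds)

try:
    import horovod
    print("Horovod:", horovod.__version__)    # expected: 0.19.1
except ImportError:
    print("Horovod: not installed")
```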
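The quoted setup (Adam, batch size 64, base learning rate 0.1, linear warmup, then a reduction by a factor of 10 at the 30th, 60th, and 80th epochs) translates into a short PyTorch scheduler sketch. The 5-epoch warmup length and the ResNet-50 model below are assumptions, since the card does not state them.

```python
# Sketch of the quoted schedule: linear warmup to the base LR of 0.1, then divide by 10
# at epochs 30, 60, and 80. The warmup length (5 epochs) and the model are assumptions.
import torch
import torchvision

model = torchvision.models.resnet50()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)  # batch size 64 is set in the DataLoader

WARMUP_EPOCHS = 5  # assumption: not stated in the card

def lr_scale(epoch: int) -> float:
    """Multiplicative factor applied to the base learning rate at each epoch."""
    if epoch < WARMUP_EPOCHS:
        return (epoch + 1) / WARMUP_EPOCHS                 # linear warmup
    return 0.1 ** sum(epoch >= e for e in (30, 60, 80))    # 10x decay at epochs 30/60/80

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lr_scale)

# Typical per-epoch usage (train_one_epoch is a hypothetical training loop):
# for epoch in range(90):
#     train_one_epoch(model, optimizer)
#     scheduler.step()
```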