A Continuous Mapping For Augmentation Design

Authors: Keyu Tian, Chen Lin, Ser-Nam Lim, Wanli Ouyang, Puneet Dokania, Philip Torr

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Results over multiple benchmarks demonstrate the efficiency improvement of this work compared with previous methods.
Researcher Affiliation | Collaboration | Keyu Tian (Software College, Beihang University, tiankeyu.00@gmail.com); Chen Lin (University of Oxford, chen.lin@eng.ox.ac.uk); Ser-Nam Lim (Facebook AI); Wanli Ouyang (University of Sydney); Puneet K. Dokania (University of Oxford & Five AI Ltd.); Philip H.S. Torr (University of Oxford, philip.torr@eng.ox.ac.uk)
Pseudocode | Yes | Algorithm 1: A Reference Method for ADA: MCMC-Aug
Open Source Code | No | Both source code and checkpoints will be released to the public, and future work may concern more image modalities like infrared/X-ray/ultrasound imaging, more diverse augmentations, or wider applications, e.g., in other computer vision areas or time-series processing.
Open Datasets | Yes | We evaluate our method on CIFAR-10/100 [18] and ImageNet [7].
Dataset Splits | Yes | For each dataset, a validation set is split from the training set, and the test set is used only for evaluating the final performance (it is not involved in the search phase).
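The split protocol quoted above (validation held out from training, test set untouched during search) can be sketched as follows. This is a minimal illustration, not the paper's code; the 10% validation fraction and the helper name `split_train_val` are illustrative assumptions, since the paper does not state the split ratio here.

```python
import numpy as np

def split_train_val(num_examples, val_fraction=0.1, seed=0):
    """Hold out a validation split from the training indices.

    Illustrative sketch: val_fraction=0.1 is an assumption; the
    paper only states that a validation set is split from training.
    """
    rng = np.random.default_rng(seed)
    indices = rng.permutation(num_examples)  # shuffle before splitting
    n_val = int(num_examples * val_fraction)
    # Validation indices come first; the remainder stays for training.
    return indices[n_val:], indices[:n_val]

train_idx, val_idx = split_train_val(50000)  # e.g. CIFAR-10 training size
```

The test set is deliberately absent from this function: it is only consulted once, after the augmentation search finishes.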
Hardware Specification | Yes | Table 4: Comparison of the efficiency (GPU hours). GPU device: V100 (for the MCMC method).
Software Dependencies | No | No specific software dependencies with version numbers are mentioned in the paper.
Experiment Setup | Yes | All batch sizes for augmentation search mentioned in this work are set to 256 for CIFAR and 1024 for ImageNet, as standard. We then run MCMC-Aug for 200 epochs, the same as [16], over the training set. The step size and noise scale of SGLD are set according to [25]. The reported experiments use a fixed step size of 0.4 and a noise rate of 2×10⁻⁵. Other hyperparameters not mentioned here are imported directly from [28] and provided in Supplementary C.
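The SGLD hyperparameters quoted in the setup (step size 0.4, noise rate 2×10⁻⁵) can be placed in context with a standard SGLD update. This is a hedged sketch based on the textbook SGLD form, not the paper's implementation; how exactly the noise rate scales the Gaussian term is an assumption.

```python
import numpy as np

def sgld_step(theta, grad, step_size=0.4, noise_scale=2e-5, rng=None):
    """One SGLD update on the augmentation parameters theta.

    Assumption: standard SGLD form theta <- theta - (eps/2) * grad + noise,
    with the injected Gaussian noise scaled by sqrt(eps) * noise_scale.
    The defaults mirror the values quoted in the experiment setup.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(0.0, np.sqrt(step_size) * noise_scale, size=theta.shape)
    return theta - 0.5 * step_size * grad + noise
```

With a noise rate this small, each step is dominated by the gradient term; the noise mainly keeps the chain from collapsing to a point estimate.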