Neural Graph Embedding for Neural Architecture Search

Authors: Wei Li, Shaogang Gong, Xiatian Zhu

AAAI 2020, pp. 4707-4714

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments show the superiority of NGE over the state-of-the-art methods on image classification and semantic segmentation.
Researcher Affiliation | Academia | Queen Mary University of London; University of Surrey
Pseudocode | Yes | Algorithm 1: Neural Graph Embedding (NGE) for NAS
Open Source Code | No | The paper does not provide an explicit statement or link to open-source code for the described methodology.
Open Datasets | Yes | CIFAR. Both CIFAR-10 and CIFAR-100 (Krizhevsky et al. 2009)... ImageNet. For the large-scale image classification evaluation, we used the ILSVRC2012, a subset of ImageNet (Russakovsky et al. 2015)... PASCAL VOC 2012. We used the PASCAL VOC 2012 (Everingham et al. 2015) for semantic segmentation evaluation.
Dataset Splits | Yes | We split 25K images from the training set for validation. (A minimal split sketch follows the table.)
Hardware Specification | Yes | With NGE, the search on CIFAR-10 took only 2.4 hours on a single NVIDIA Tesla V100 GPU.
Software Dependencies | No | The paper mentions optimizers (SGD, Adam) and activation functions (ReLU) but does not provide specific version numbers for any software libraries, frameworks, or programming languages used.
Experiment Setup | Yes | For the network parameter w, we used SGD with an initial learning rate of 0.025 and a momentum of 0.9. We decayed the learning rate to 0 during training using a cosine schedule. A weight decay of 3 × 10⁻⁴ was imposed to avoid over-fitting. For the NGE learning, we used the Adam optimiser with a fixed learning rate of 6 × 10⁻⁴ and set the weight decay to 1 × 10⁻³. To search the normal cell and reduction cell efficiently, we used 25 epochs for training the proxy network. (An optimizer configuration sketch follows the table.)
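
The 25K validation split quoted under Dataset Splits halves CIFAR-10's 50K training images into a weight-training half and an architecture-validation half. A minimal sketch of such a split, assuming PyTorch/torchvision (the paper does not name its framework); the batch size and variable names are illustrative, not taken from the paper:

```python
import torch
from torch.utils.data import DataLoader, SubsetRandomSampler
from torchvision import datasets, transforms

# CIFAR-10 training set: 50K images. Following the quoted protocol,
# hold out 25K of them for architecture validation.
train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())

num_train = len(train_set)          # 50,000
split = num_train - 25_000          # 25,000 images kept for weight training
indices = torch.randperm(num_train).tolist()

# Two disjoint samplers over the same underlying dataset.
train_loader = DataLoader(train_set, batch_size=96,
                          sampler=SubsetRandomSampler(indices[:split]))
val_loader = DataLoader(train_set, batch_size=96,
                        sampler=SubsetRandomSampler(indices[split:]))
```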
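The Experiment Setup row pins down both optimizers fully enough to reconstruct them. A sketch under the same PyTorch assumption; the placeholder parameter groups stand in for the supernet weights w and the NGE embedding parameters, which in the paper come from its own model:

```python
import torch
import torch.nn as nn

EPOCHS = 25  # proxy-network search budget quoted in the table

# Placeholder parameter groups; hypothetical stand-ins for the real model.
weights = [nn.Parameter(torch.randn(8, 8))]
nge_params = [nn.Parameter(torch.randn(8, 4))]

# SGD over the network weights w: initial lr 0.025, momentum 0.9,
# weight decay 3e-4, cosine-annealed to 0 over the 25 search epochs.
w_optimizer = torch.optim.SGD(weights, lr=0.025, momentum=0.9,
                              weight_decay=3e-4)
w_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    w_optimizer, T_max=EPOCHS, eta_min=0.0)

# Adam over the NGE parameters: fixed lr 6e-4, weight decay 1e-3
# (no scheduler, per the quoted setup).
nge_optimizer = torch.optim.Adam(nge_params, lr=6e-4, weight_decay=1e-3)

for epoch in range(EPOCHS):
    ...  # presumably alternate w updates (training half) and NGE updates
         # (validation half), consuming the 25K/25K split sketched above
    w_scheduler.step()
```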