DevFly: Bio-Inspired Development of Binary Connections for Locality Preserving Sparse Codes

Authors: Tianqi Wei, Rana Alkhoury Maroun, Qinghai Guo, Barbara Webb

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments show that the accuracy of searching for nearest neighbours is improved, although performance depends on the parameter values and datasets used.
Researcher Affiliation | Collaboration | Tianqi Wei, School of Informatics, University of Edinburgh, Edinburgh, UK EH8 9AB, tianqi-wei@outlook.com; Rana Alkhoury Maroun, School of Informatics, University of Edinburgh, Edinburgh, UK EH8 9AB, rana.e.elkhoury@gmail.com; Qinghai Guo, ACS Lab, Huawei Technologies, Shenzhen, China, guoqinghai@huawei.com; Barbara Webb, School of Informatics, University of Edinburgh, Edinburgh, UK EH8 9AB, B.Webb@ed.ac.uk
Pseudocode | Yes | Algorithm 1: Connection development for Method 1; Algorithm 2: Connection development for Methods 2 and 3.
Open Source Code | Yes | The code is available in the supplementary materials and on GitHub: https://github.com/InsectRobotics/DevFlyPublication.git.
Open Datasets | Yes | The datasets are MNIST (CC BY-SA 3.0), CIFAR-10 (MIT), SIFT10M and GloVe (Apache-2.0). MNIST (LeCun et al., 1998) is a dataset... CIFAR-10 (Krizhevsky, 2009) is a dataset... SIFT10M (Jegou et al., 2010) also contains... The GloVe dataset (Pennington et al., 2014) contains...
Dataset Splits | No | The paper mentions 10,000 samples used for training and 1,000 for querying (testing), but does not specify a separate validation set or its split percentage/count.
Hardware Specification | Yes | Tested on a computer running Ubuntu 20.04 with an Intel Core i9-10940X CPU providing 28 hyper-threaded logical cores.
Software Dependencies | No | The code was implemented in Python, and most of the code for testing these methods was shared. (No specific Python version or library versions are mentioned.)
Experiment Setup | Yes | Results are reported for the three DevFly methods with different hyperparameters, varying k, m and α. In these experiments, α = 0.1 and the sparseness m/k = 20, while k varies.
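For context on the hyperparameters above, k is the length of the sparse binary code and m the number of active units per code in fly-inspired hashing schemes. A minimal sketch of the winner-take-all encoding step, assuming a random binary connection matrix as in the FlyHash-style baseline (the DevFly methods instead develop these connections from data); all names and parameter values here are illustrative:

```python
import numpy as np

def fly_hash(x, W, m):
    """Encode x as a sparse binary code: project through the binary
    connection matrix W (k rows), then keep only the top-m responses
    active (winner-take-all)."""
    activity = W @ x                         # k projection responses
    code = np.zeros(W.shape[0], dtype=np.uint8)
    code[np.argsort(activity)[-m:]] = 1      # m winners out of k units
    return code

rng = np.random.default_rng(0)
d, k, m = 784, 2000, 100                     # input dim (e.g. MNIST); k, m illustrative
# Random sparse binary connections; DevFly would learn this matrix instead.
W = (rng.random((k, d)) < 0.1).astype(np.float64)

x = rng.random(d)
h = fly_hash(x, W, m)
print(h.shape, int(h.sum()))                 # code of length k with m active bits
```

Nearest-neighbour search then compares these binary codes (e.g. by Hamming distance) rather than the raw inputs, which is where the accuracy results in the table are measured.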