Deterministic Mini-batch Sequencing for Training Deep Neural Networks

Authors: Subhankar Banerjee, Shayok Chakraborty

AAAI 2021, pp. 6723-6731 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Experimental | "Our extensive empirical analyses on three challenging datasets corroborate the merit of our framework over competing baselines. We further study the performance of our framework on two other applications besides classification (regression and semantic segmentation) to validate its generalizability."
Researcher Affiliation | Academia | Subhankar Banerjee, Shayok Chakraborty; Department of Computer Science, Florida State University
Pseudocode | Yes | Algorithm 1: The Proposed MMD-based Mini-batch Selection Algorithm (a sketch of the underlying MMD computation appears below the table)
Open Source Code | No | The paper does not provide an explicit statement about releasing its own source code, nor a link to a code repository for the described methodology.
Open Datasets | Yes | "We studied the performance of our framework on three benchmark datasets: MNIST (Le Cun et al. 1998), CIFAR-10 (Krizhevsky 2009) and SVHN (Netzer et al. 2011)."
Dataset Splits | No | The paper states, "The train / test splits given in each of these datasets were used in our experiments." However, it does not explicitly mention a separate validation split or how one was derived or used.
Hardware Specification | Yes | "The implementations were all performed in Matlab R2019b running on a workstation equipped with an NVIDIA Quadro RTX5000 GPU with 16GB memory."
Software Dependencies | Yes | "The implementations were all performed in Matlab R2019b."
Experiment Setup | Yes | "The L2 regularizer was used with regularization parameter 0.0005. We used 0.001 as the initial learning rate and reduced it by a factor of 0.1 every 10 epochs. The stochastic gradient descent with momentum (SGDM) was used as the optimizer, where the momentum parameter was set at 0.9. All the experiments were run for 35 epochs with a mini-batch size of 50. A Gaussian kernel with parameter 1 was used in the MMD computations." (See the trainingOptions sketch below the table.)
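
The "Pseudocode" row above refers to the paper's Algorithm 1, an MMD-based mini-batch selection procedure. To make the quantity concrete, here is a minimal MATLAB sketch of the standard biased squared-MMD estimator under a Gaussian kernel. This is our own illustration, not code from the paper: the function names mmd2 and gaussKernel are hypothetical, and we interpret the reported "kernel parameter 1" as the bandwidth sigma.

```matlab
function d = mmd2(X, Y, sigma)
% Biased estimate of squared maximum mean discrepancy between sample sets
% X (n x d) and Y (m x d) under a Gaussian (RBF) kernel with bandwidth sigma.
Kxx = gaussKernel(X, X, sigma);
Kyy = gaussKernel(Y, Y, sigma);
Kxy = gaussKernel(X, Y, sigma);
d = mean(Kxx(:)) - 2*mean(Kxy(:)) + mean(Kyy(:));
end

function K = gaussKernel(A, B, sigma)
% Pairwise kernel matrix: K(i,j) = exp(-||a_i - b_j||^2 / (2*sigma^2)).
sqdist = sum(A.^2, 2) + sum(B.^2, 2)' - 2*(A*B');  % implicit expansion (R2016b+)
sqdist = max(sqdist, 0);                            % guard against round-off
K = exp(-sqdist ./ (2*sigma^2));
end

% Example (hypothetical): MMD between a candidate mini-batch B (50 x d)
% and the remaining pool P, using the paper's reported kernel parameter:
%   d = mmd2(B, P, 1);
```

A selection procedure in the spirit of Algorithm 1 would score candidate mini-batches with such a statistic and sequence them accordingly; the exact selection rule is given in the paper.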
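The reported experiment setup maps naturally onto MATLAB's Deep Learning Toolbox. The following is a hypothetical reconstruction assuming that toolbox was used (the paper names only Matlab R2019b); trainingOptions and its name-value arguments are genuine Toolbox API, but the correspondence to the authors' actual scripts is our assumption.

```matlab
% Hypothetical reconstruction of the reported training configuration.
opts = trainingOptions('sgdm', ...
    'Momentum', 0.9, ...                  % SGDM momentum reported in the paper
    'InitialLearnRate', 0.001, ...        % initial learning rate
    'LearnRateSchedule', 'piecewise', ... % step decay of the learning rate
    'LearnRateDropFactor', 0.1, ...       % reduce by a factor of 0.1 ...
    'LearnRateDropPeriod', 10, ...        % ... every 10 epochs
    'L2Regularization', 0.0005, ...       % L2 regularization parameter
    'MaxEpochs', 35, ...                  % 35 training epochs
    'MiniBatchSize', 50);                 % mini-batch size of 50

% net = trainNetwork(XTrain, YTrain, layers, opts);  % layers: user-defined network
```

Note that this standard pipeline draws mini-batches by shuffling, whereas the paper's contribution is a deterministic mini-batch sequence; reproducing the method would additionally require feeding batches in the order produced by Algorithm 1.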