DeepDSL: A Compilation-based Domain-Specific Language for Deep Learning

Authors: Tian Zhao, Xiao Bing Huang, Yu Cao

ICLR 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluated DeepDSL with a number of popular DL networks. Our experiments show that the compiled programs have very competitive runtime performance and memory efficiency compared to the existing libraries.
Researcher Affiliation | Academia | Tian Zhao & Xiao Bing Huang, Department of Computer Science, University of Wisconsin-Milwaukee, Milwaukee, WI, USA, {tzhao,xiaobing}@uwm.edu; Yu Cao, Department of Computer Science, The University of Massachusetts Lowell, Lowell, MA, USA, ycao@cs.uml.edu
Pseudocode | No | The paper includes code examples in Scala (Figure 2 and subsequent code blocks) but does not provide structured pseudocode or clearly labeled algorithm blocks.
Open Source Code | Yes | DeepDSL is available at https://github.com/deepdsl/deepdsl.
Open Datasets | Yes | All our tests are trained with ImageNet images that have been resized to 224 by 224.
Dataset Splits | No | The paper mentions training and testing data but does not provide specific details on validation splits, split percentages, or sample counts.
Hardware Specification | Yes | The tests are run on a server with a single NVIDIA Tesla K40C GPU equipped with 12 gigabytes of memory. The server runs the CentOS 7 Linux distribution.
Software Dependencies | Yes | DeepDSL uses the JCuda 0.8.0RC binding that runs against CUDA 8.0.279. (An illustrative dependency declaration appears after the table.)
Experiment Setup | Yes | Figure 2 shows the complete implementation for compiling a program to train and test LeNet... val solver = Train("lenet", 1000, 10, 0.01f, 0.9f, 0.0005f, 0) // output file, train and test iterations, learning rate, momentum, weight decay, gradient clipping (0 means none). (A fuller sketch in the style of Figure 2 follows the table.)
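
Software dependency sketch. The lines below show, in sbt form, how a Scala project could declare the JCuda bindings referenced in the Software Dependencies row. The org.jcuda Maven coordinates and the availability of an 0.8.0RC artifact are assumptions for illustration only; the build files in the DeepDSL repository are authoritative.

// Hypothetical build.sbt fragment; coordinates and version are assumptions.
libraryDependencies ++= Seq(
  "org.jcuda" % "jcuda"  % "0.8.0RC",  // core CUDA binding
  "org.jcuda" % "jcudnn" % "0.8.0RC"   // cuDNN binding used for DL kernels
)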
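
Experiment setup sketch. To give context for the Train(...) call quoted in the last row, the following reconstructs the general shape of the paper's Figure 2 LeNet program, in which layers are Scala functions composed into a network and the solver parameters drive a compiled training/testing loop. The layer-constructor and generator names (Vec, CudaLayer, Layer, Loop, cudnn_gen, Mnist) are recalled from the paper's examples and may not match the released DeepDSL API exactly.

// Sketch in the style of Figure 2; names outside the Train(...) call are
// assumptions based on the paper's examples, not a verbatim listing.
object Lenet {
  def main(args: Array[String]): Unit = {
    val K = 10                                      // number of classes
    val N = 500; val C = 1; val H = 28; val W = 28  // batch size, channels, image size

    // Symbolic training data (images and labels).
    val x = Vec._new(Mnist, "image", "X", N, C, H, W)
    val y = Vec._new(Mnist, "label", "Y", N)

    // Layers are functions; the network is their composition.
    val cv1 = CudaLayer.convolv("cv1", 5, 20)
    val cv2 = CudaLayer.convolv("cv2", 5, 50)
    val mp = CudaLayer.max_pool(2)
    val flat = Layer.flatten(4, 1)
    val fc1 = Layer.full("fc1", 500)
    val fc2 = Layer.full("fc2", K)
    val relu = CudaLayer.relu(2)
    val softmax = CudaLayer.softmax
    val network = fc2 o relu o fc1 o flat o mp o cv2 o mp o cv1

    // Loss and accuracy expressions over the symbolic inputs.
    val x1 = x.asCuda
    val y1 = y.asIndicator(K).asCuda
    val loss = (Layer.log_loss(y1) o softmax o network) (x1)
    val accuracy = (Layer.precision(y1) o network) (x1)

    // Solver: output file, train and test iterations, learning rate,
    // momentum, weight decay, gradient clipping (0 means none).
    val solver = Train("lenet", 1000, 10, 0.01f, 0.9f, 0.0005f, 0)

    // Compile the training/testing loop to Java source.
    val loop = Loop(loss, accuracy, (x, y), loss.freeVar.toList, solver)
    cudnn_gen.print(loop)
  }
}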