Deep Scale-spaces: Equivariance Over Scale

Authors: Daniel Worrall, Max Welling

NeurIPS 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We demonstrate our networks on the Patch Camelyon and Cityscapes datasets, to prove their utility and perform introspective studies to further understand their properties." |
| Researcher Affiliation | Collaboration | Daniel E. Worrall, AMLAB, Philips Lab, University of Amsterdam (d.e.worrall@uva.nl); Max Welling, AMLAB, Philips Lab, University of Amsterdam (m.welling@uva.nl) |
| Pseudocode | No | No pseudocode or algorithm blocks are explicitly labeled or formatted in the paper. |
| Open Source Code | No | The paper mentions 'deworrall92.github.io' but does not explicitly state that source code for the described methodology is available there, nor does it provide a direct repository link or a statement of code release. |
| Open Datasets | Yes | Patch Camelyon [Veeling et al., 2018] and Cityscapes [Cordts et al., 2016]. |
| Dataset Splits | Yes | "The Cityscapes dataset [Cordts et al., 2016] contains 2975 training images, 500 validation images, and 1525 test images of resolution 2048 × 1024 px." |
| Hardware Specification | No | The paper mentions training "split over 4 GPUs" but does not give GPU models or other hardware details (e.g., CPU, memory). |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., libraries, frameworks, or programming-language versions) are mentioned. |
| Experiment Setup | Yes | "Our training procedure is: 100 epochs SGD, learning rate 0.1 divided by 10 every 40 epochs, momentum 0.9, batch size of 512, split over 4 GPUs." A sketch of this schedule appears after the table. |
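
The quoted experiment setup amounts to a standard SGD schedule with step learning-rate decay. Below is a minimal PyTorch sketch of that schedule, assuming a cross-entropy classification objective; `model` and `train_loader` are hypothetical placeholders (the paper does not release code), and the 4-GPU data parallelism is omitted.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

def train(model, train_loader, device="cuda"):
    # Hyperparameters as reported in the paper: SGD, lr 0.1, momentum 0.9,
    # 100 epochs, learning rate divided by 10 every 40 epochs.
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = SGD(model.parameters(), lr=0.1, momentum=0.9)
    scheduler = StepLR(optimizer, step_size=40, gamma=0.1)

    model.to(device)
    for epoch in range(100):
        model.train()
        # The paper uses an effective batch size of 512, split over 4 GPUs;
        # here a single-device loader is assumed.
        for images, targets in train_loader:
            images, targets = images.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), targets)
            loss.backward()
            optimizer.step()
        scheduler.step()
```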