On Contrastive Representations of Stochastic Processes

Authors: Emile Mathieu, Adam Foster, Yee Whye Teh

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirically, we show that our methods are effective for learning representations of periodic functions, 3D objects and dynamical processes."
Researcher Affiliation | Collaboration | Emile Mathieu, Adam Foster, Yee Whye Teh, {emile.mathieu, adam.foster, y.w.teh}@stats.ox.ac.uk; Department of Statistics, University of Oxford, United Kingdom; DeepMind, United Kingdom
Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks.
Open Source Code | Yes | "Our code is publicly available at github.com/ae-foster/cresp."
Open Datasets | Yes | "We apply CRESP to ShapeNet (Chang et al., 2015), a standard dataset in the field of 3D object representations."
Dataset Splits | No | The paper mentions "training views" and "test views" but does not specify train/validation/test splits with percentages, absolute counts, or references to predefined validation sets.
Hardware Specification | No | The paper does not specify the hardware used for experiments (e.g., GPU or CPU models, memory, or cluster specifications).
Software Dependencies | No | The paper does not list software dependencies with version numbers.
Experiment Setup | Yes | "Please refer to Appendix D for full experimental details. [...] We train all models for 200 epochs, varying the distance between modes and the number of training context points. [...] They are trained for 200 epochs, with contexts of 5 randomly sampled pairs {y_i = F(x_i), x_i ∼ U([0, 1])}."
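The quoted setup describes context sets of 5 pairs with inputs drawn uniformly from [0, 1]. A minimal sketch of that sampling step is below; the `sample_context` helper and the sine function are illustrative assumptions, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_context(F, n_context=5):
    """Sample a context set {(x_i, y_i = F(x_i))} with x_i ~ U([0, 1]).

    Hypothetical helper mirroring the paper's description of
    "contexts of 5 randomly sampled pairs"; F stands in for the
    underlying stochastic-process sample path.
    """
    x = rng.uniform(0.0, 1.0, size=n_context)
    y = F(x)
    return x, y

# Illustrative periodic function, echoing the 1D periodic-function experiments.
x, y = sample_context(lambda t: np.sin(2 * np.pi * t), n_context=5)
```

The actual training loop (200 epochs, varying mode distance and context size) is detailed in the paper's Appendix D.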