Equivariant Learning of Stochastic Fields: Gaussian Processes and Steerable Conditional Neural Processes
Authors: Peter Holderrieth, Michael J Hutchinson, Yee Whye Teh
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In experiments with Gaussian process vector fields, images, and real-world weather data, we observe that SteerCNPs significantly improve the performance of previous models and equivariance leads to improvements in transfer learning tasks. |
| Researcher Affiliation | Collaboration | 1University of Oxford, United Kingdom 2DeepMind, United Kingdom. |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide a repository link or an explicit statement that source code for the described methodology has been released. |
| Open Datasets | Yes | MNIST and rotMNIST. We first train models on completion tasks from the MNIST data set (LeCun et al., 2010). |
| Dataset Splits | No | The paper mentions splitting the data into "train, validation and test data set" but does not give specific percentages, sample counts, or a detailed splitting methodology in the provided text. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions software tools such as PyTorch, NumPy, SciPy, and Matplotlib, but does not provide version numbers for these dependencies. |
| Experiment Setup | No | The paper describes the general training procedure (minimizing the negative log-likelihood by gradient descent) but does not report specific hyperparameter values (e.g., learning rate, batch size, number of epochs, optimizer settings) in the main text. |