Compositional Kernel Machines

Authors: Robert Gens, Pedro Domingos

ICLR 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this paper we define CKMs, explore their properties, and present promising results on NORB datasets. Experiments show that CKMs can outperform SVMs and be competitive with convnets in a number of dimensions, by learning symmetries and compositional concepts from fewer samples without data augmentation.
Researcher Affiliation | Academia | Robert Gens & Pedro Domingos, Department of Computer Science & Engineering, University of Washington, Seattle, WA 98195, USA. {rcg,pedrod}@cs.washington.edu
Pseudocode | No | The paper describes algorithms and procedures in prose and with mathematical formulations but does not include any clearly labeled 'Pseudocode' or 'Algorithm' blocks or figures.
Open Source Code | No | The paper does not provide a link to its own source code for the described methodology or make a clear statement about its availability. It only refers to a third-party library, TensorFlow, which is a general framework, not their specific implementation.
Open Datasets | Yes | We test CKMs on three image classification scenarios that feature images from either the small NORB dataset or the NORB jittered-cluttered dataset (LeCun et al., 2004). (A loading sketch follows the table.)
Dataset Splits | Yes | In this experiment, we partition the training set of NORB jittered-cluttered into a new dataset with 10% withheld for each of validation and testing. (A split sketch follows the table.)
Hardware Specification | No | The paper mentions running experiments on 'CPU' and 'GPU' ('CKM on a CPU' and 'convnets trained for much longer on a GPU') but does not specify any particular models (e.g., Intel Core i7, NVIDIA Tesla V100) or detailed hardware configurations.
Software Dependencies | No | The paper mentions 'Convnets and their features are computed using the TensorFlow library (Abadi et al., 2015)' but does not provide a specific version number for TensorFlow or any other software dependency.
Experiment Setup | Yes | The hyperparameters of ORB feature extraction, leaf kernels, cost function, and optimization were chosen using grid search on a validation set. (A grid-search sketch follows the table.)
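The Open Datasets row refers to the publicly available NORB datasets. A minimal loading sketch, assuming the tensorflow_datasets package (not something the paper uses); its smallnorb builder covers only the small NORB variant, so the jittered-cluttered set would have to be downloaded from the original NORB release:

```python
# Illustrative only: the paper does not describe how the NORB data were obtained or preprocessed.
import tensorflow_datasets as tfds

train_ds, test_ds = tfds.load("smallnorb", split=["train", "test"])
for example in train_ds.take(1):
    image = example["image"]           # 96x96x1 grayscale view (stereo pair: "image", "image2")
    label = example["label_category"]  # one of 5 object categories
    print(image.shape, int(label))
```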
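The Dataset Splits row quotes a 10% validation / 10% test hold-out from the NORB jittered-cluttered training set. A minimal sketch of such a partition, assuming the training images and labels are already loaded as NumPy arrays; the function name, the shuffling, and the fixed seed are illustrative, not details from the paper:

```python
import numpy as np

def split_train_val_test(images, labels, val_frac=0.1, test_frac=0.1, seed=0):
    """Withhold val_frac and test_frac of the original training set."""
    rng = np.random.default_rng(seed)          # fixed seed is an assumption, not from the paper
    order = rng.permutation(len(images))
    n_val = int(len(images) * val_frac)
    n_test = int(len(images) * test_frac)
    val_idx = order[:n_val]
    test_idx = order[n_val:n_val + n_test]
    train_idx = order[n_val + n_test:]
    return ((images[train_idx], labels[train_idx]),
            (images[val_idx], labels[val_idx]),
            (images[test_idx], labels[test_idx]))
```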
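The Experiment Setup row states that hyperparameters were chosen by grid search on a validation set. A generic sketch of that selection loop; train_and_evaluate, the parameter names, and the grid values are placeholders, since the paper does not list the actual search space:

```python
from itertools import product

def grid_search(train_and_evaluate, grid):
    """Evaluate every combination in `grid` and keep the best validation score."""
    best_params, best_score = None, float("-inf")
    keys = sorted(grid)
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = train_and_evaluate(**params)  # user-supplied; returns e.g. validation accuracy
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical grid over the hyperparameter groups named in the paper
# (ORB feature extraction, leaf kernels, cost function, optimization).
example_grid = {
    "orb_num_features": [256, 512],
    "leaf_kernel_bandwidth": [0.1, 1.0],
    "cost_margin": [0.5, 1.0],
    "learning_rate": [1e-3, 1e-2],
}
```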