Learning Transformations for Classification Forests

Authors: Qiang Qiu; Guillermo Sapiro

ICLR 2014

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Theoretical and experimental results support the proposed framework." "This section presents experimental evaluations using public datasets: the MNIST handwritten digit dataset, the Extended Yale B face dataset, and the 15-Scenes natural scene dataset."
Researcher Affiliation | Academia | "Qiang Qiu QIANG.QIU@DUKE.EDU Department of Electrical and Computer Engineering, Duke University, Durham, NC 27708, USA Guillermo Sapiro GUILLERMO.SAPIRO@DUKE.EDU Department of Electrical and Computer Engineering, Department of Computer Science, Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA"
Pseudocode | No | The paper describes its algorithms and optimization procedures (e.g., gradient descent, K-SVD) in prose but does not present them in a structured pseudocode or algorithm block. (An illustrative gradient-descent sketch follows the table.)
Open Source Code | No | The paper does not provide any statement or link indicating that its source code is publicly available.
Open Datasets | Yes | "This section presents experimental evaluations using public datasets: the MNIST handwritten digit dataset, the Extended Yale B face dataset, and the 15-Scenes natural scene dataset." (A data-loading sketch follows the table.)
Dataset Splits | No | The paper mentions a "validation process" for choosing tree depth but does not specify a validation split (e.g., percentages or sample counts) separate from the training and testing splits. (An illustrative split sketch follows the table.)
Hardware Specification | No | The paper does not specify any hardware details such as GPU/CPU models, memory, or the computing environment used for the experiments.
Software Dependencies | No | The paper mentions methods such as K-SVD and C4.5 but does not name specific software packages or version numbers needed for replication.
Experiment Setup | Yes | "We train 20 classification trees with a depth of 9, each using only 10% randomly selected training samples." "We train 20 classification trees with a depth of 5, each using all training samples." "We train 30 classification trees with a depth of 9, each using 5% randomly selected training samples." (A sketch of this ensemble protocol follows the table.)
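Since the paper describes its optimization only in prose, the following is a minimal, purely illustrative sketch of the kind of gradient-descent loop it refers to: it learns a linear transform T that shrinks within-class scatter while staying close to the identity. The objective, regularizer, and step size here are assumptions made for illustration, not the authors' actual formulation.

```python
# Hypothetical sketch of a gradient-descent update for a learned linear
# transform T. This is NOT the paper's objective; it only illustrates the
# style of update loop the paper describes in prose rather than pseudocode.
import numpy as np

def learn_transform(X, y, lr=0.01, lam=0.1, n_iters=100):
    """X: (n_samples, d) feature array; y: integer class labels."""
    d = X.shape[1]
    T = np.eye(d)  # start from the identity transform
    for _ in range(n_iters):
        # regularizer pulls T back toward the identity
        grad = lam * (T - np.eye(d))
        for c in np.unique(y):
            Xc = X[y == c]
            Xc_centered = Xc - Xc.mean(axis=0)
            # gradient of ||T Xc^T||_F^2, the within-class scatter after T
            grad += 2.0 * T @ (Xc_centered.T @ Xc_centered) / len(X)
        T -= lr * grad
    return T
```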
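Of the three cited datasets, only MNIST has a widely available programmatic loader; a minimal sketch using scikit-learn's OpenML mirror is below. The mirror choice is an assumption, since the paper does not state where its copies of the data came from, and Extended Yale B and 15-Scenes must be downloaded manually from their project pages.

```python
# Minimal sketch of obtaining one of the cited public datasets (MNIST).
# The OpenML mirror is an assumption, not the distribution the authors used.
from sklearn.datasets import fetch_openml

mnist = fetch_openml("mnist_784", version=1, as_frame=False)
X, y = mnist.data, mnist.target.astype(int)
print(X.shape)  # (70000, 784)
```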
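Because the paper mentions a validation process for depth selection without giving split sizes, any reimplementation has to pick its own. The sketch below uses a hypothetical stratified 80/20 train/validation split to select the tree depth; the 20% figure and the candidate depths are assumptions, not values from the paper.

```python
# Illustrative depth selection on a held-out validation set.
# The 80/20 split and candidate depths are assumed, not from the paper.
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def pick_depth(X, y, depths=(3, 5, 7, 9), seed=0):
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.2, random_state=seed, stratify=y)
    scores = {}
    for d in depths:
        clf = DecisionTreeClassifier(max_depth=d, random_state=seed)
        clf.fit(X_tr, y_tr)
        scores[d] = clf.score(X_val, y_val)
    return max(scores, key=scores.get)
```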
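The quoted setup fixes three knobs per experiment: the number of trees, the tree depth, and the fraction of training samples drawn per tree. The sketch below reproduces only that ensemble protocol with off-the-shelf scikit-learn trees; it does not include the learned per-node transformations that are the paper's actual contribution.

```python
# Sketch of the reported ensemble protocol: n_trees fixed-depth trees, each
# fit on a random fraction of the training data, majority-voted at test time.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_forest(X, y, n_trees=20, depth=9, frac=0.10, seed=0):
    """X: numpy array of features; y: non-negative integer class labels."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.choice(len(X), size=max(1, int(frac * len(X))), replace=False)
        t = DecisionTreeClassifier(max_depth=depth)
        t.fit(X[idx], y[idx])
        trees.append(t)
    return trees

def predict_forest(trees, X):
    votes = np.stack([t.predict(X) for t in trees])  # (n_trees, n_samples)
    # majority vote per sample; labels assumed to be non-negative integers
    return np.apply_along_axis(
        lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)

# The MNIST setting quoted above would correspond to:
# trees = train_forest(X_train, y_train, n_trees=20, depth=9, frac=0.10)
```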