Self-paced Convolutional Neural Networks

Authors: Hao Li, Maoguo Gong

IJCAI 2017

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results on MNIST and rectangles datasets demonstrate that the proposed method outperforms baseline methods.
Researcher Affiliation | Academia | Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, Xidian University, Xi'an, China
Pseudocode | Yes | Algorithm 1 Algorithm of Self-paced Learning. ... Algorithm 2 Algorithm of Self-paced Convolutional Networks.
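The self-paced learning scheme referenced in Algorithm 1 alternates between weighting samples by their current loss and gradually raising the pace so harder samples are admitted. A minimal sketch of one such outer iteration is below; the threshold `lam` (the pace parameter λ), the growth factor `mu`, and the function names are illustrative assumptions, not from the paper, and the model update on the weighted samples is omitted.

```python
def spl_weights(losses, lam):
    """Hard self-paced weights: a sample is kept (weight 1.0) only if
    its current loss is below the pace threshold lam, else dropped (0.0)."""
    return [1.0 if loss < lam else 0.0 for loss in losses]

def self_paced_step(losses, lam, mu=1.3):
    """One outer iteration (sketch): with the model fixed, recompute
    the sample weights, then enlarge the pace threshold so that harder
    samples enter in later rounds. mu > 1 is an assumed growth factor."""
    v = spl_weights(losses, lam)
    # ... retrain the model on the weighted samples here (omitted) ...
    return v, lam * mu
```

With losses `[0.1, 0.5, 2.0]` and `lam = 1.0`, only the two easy samples receive weight 1.0 in this round, and the threshold grows to admit the hard one later.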
Open Source Code | No | The paper does not provide any specific links or explicit statements about the availability of its source code.
Open Datasets | Yes | we studied variants of MNIST digits [Larochelle et al., 2007] and the rectangle datasets [Larochelle et al., 2007] in the experiments. Each variant includes 10,000 labeled training, 2,000 labeled validation, and 50,000 labeled test images.
Dataset Splits | Yes | Each variant includes 10,000 labeled training, 2,000 labeled validation, and 50,000 labeled test images. The dataset rectangle consists of 1,000 labeled training, 200 labeled validation, and 50,000 labeled test images. The dataset rectangle-image includes 10,000 labeled training, 2,000 labeled validation, and 50,000 labeled test images.
Hardware Specification | No | The paper does not provide any specific hardware details (e.g., GPU/CPU models, memory specifications) used for running the experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers.
Experiment Setup | Yes | The initial value of the pace sequence N1 is set to 5500, 6000, 6500, 7000 and 7500. The step size of the pace sequence is the same as the batch size. ... Then q(t) can also be defined with respect to the sample number, as follows: q(t) = 2 tan((1 − N_t/(N_maxgen + 1)) · π/4).
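The pace function above maps the number of currently included samples to a penalty that shrinks as more samples enter: at N_t = 0 the argument is π/4 and q = 2, and q decays toward 0 as N_t approaches N_maxgen. The sketch below is a hypothetical reconstruction of that garbled formula (the exact grouping of terms in the original extraction is uncertain), with `n_t` and `n_maxgen` standing in for N_t and N_maxgen.

```python
import math

def pace_q(n_t, n_maxgen):
    """Hypothetical reconstruction of the pace function
    q(t) = 2 * tan((1 - n_t / (n_maxgen + 1)) * pi / 4).
    Starts near 2 when no samples are included (n_t = 0) and
    decreases monotonically as n_t grows toward n_maxgen."""
    return 2.0 * math.tan((1.0 - n_t / (n_maxgen + 1)) * math.pi / 4.0)
```

Under this reading, the pace penalty relaxes smoothly as the training set fills up, which matches the self-paced schedule of admitting progressively harder samples.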