Self-Guided Masked Autoencoder
Authors: Jeongwoo Shin, Inseo Lee, Junho Lee, Joonseok Lee
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Comprehensive experiments on various downstream tasks verify the effectiveness of the proposed method. |
| Researcher Affiliation | Collaboration | Jeongwoo Shin¹, Inseo Lee¹, Junho Lee¹, Joonseok Lee¹,² (¹Seoul National University, ²Google Research) |
| Pseudocode | No | No explicit pseudocode or algorithm blocks were found in the paper. |
| Open Source Code | Yes | Does the paper provide open access to the data and code, with sufficient instructions to faithfully reproduce the main experimental results, as described in supplemental material? Answer: [Yes] Justification: We plan to publicly provide our code used in this paper. |
| Open Datasets | Yes | We pre-train all competing models for 400 epochs on ImageNet-1K [12], and fine-tune on 3 downstream tasks: image classification, object detection, and semantic segmentation. [...] We use CIFAR-100 [30], iNaturalist 2019 [48], and CUB200-2011 [49] for image classification. We fine-tune our model on COCO [36] for object detection, and on ADE20K [62] for semantic segmentation. |
| Dataset Splits | Yes | all experiments have been conducted on 10% of ImageNet-1K training set, unless noted otherwise. [...] We additionally measure the feature variance (σ_F) and variance of the pairwise similarities (σ_S), on the ImageNet-1K validation set (a sketch of these metrics follows the table). |
| Hardware Specification | Yes | We conduct experiments on 8 NVIDIA A6000 GPUs (48GB). |
| Software Dependencies | No | The paper does not specify particular software dependencies with version numbers. |
| Experiment Setup | Yes | We pre-train all competing models for 400 epochs on ImageNet-1K [12] [...] We fine-tune a Mask R-CNN model [24] end-to-end on COCO with a ViT backbone for 90K iterations [...] We fix the masking ratio to 0.75 for all the experiments in this section (a masking sketch follows the table). |
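The Dataset Splits row cites two diagnostic metrics, the feature variance σ_F and the variance of pairwise similarities σ_S, measured on the ImageNet-1K validation set. The sketch below is a minimal reading of how such metrics are commonly computed over an (N, D) matrix of per-image embeddings; the function name `feature_stats` and the exact normalization are assumptions, not the authors' released code.

```python
import torch

def feature_stats(features: torch.Tensor) -> tuple[float, float]:
    """One plausible computation of sigma_F and sigma_S over (N, D) embeddings.

    Minimal sketch only: the paper's exact definitions may differ. `features`
    is assumed to hold per-image embeddings (e.g., ViT [CLS] tokens) extracted
    on the ImageNet-1K validation set.
    """
    # sigma_F: average per-dimension standard deviation of the features.
    sigma_f = features.std(dim=0).mean()

    # sigma_S: variance of pairwise cosine similarities between embeddings.
    normed = torch.nn.functional.normalize(features, dim=1)
    sims = normed @ normed.T
    # Drop the diagonal: self-similarity is identically 1.
    off_diag = sims[~torch.eye(sims.shape[0], dtype=torch.bool, device=sims.device)]
    sigma_s = off_diag.var()
    return sigma_f.item(), sigma_s.item()

# Example on random features; real usage would pass encoder outputs.
sigma_f, sigma_s = feature_stats(torch.randn(512, 768))
```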
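The Experiment Setup row fixes the masking ratio at 0.75, matching the standard MAE recipe. For reference, below is a minimal sketch of plain per-sample random patch masking at that ratio; the paper's contribution is precisely to replace this uniform random selection with self-guided masking, which is not reproduced here, and the helper name is hypothetical.

```python
import torch

def random_masking(patches: torch.Tensor, mask_ratio: float = 0.75):
    """MAE-style uniform random masking over patch tokens.

    patches: (B, L, D) patch embeddings. Returns the visible subset and a
    binary mask (1 = masked). Sketch of the standard baseline recipe only.
    """
    B, L, D = patches.shape
    len_keep = int(L * (1 - mask_ratio))

    # Per-sample random permutation: patches with the lowest noise stay visible.
    noise = torch.rand(B, L, device=patches.device)
    ids_keep = noise.argsort(dim=1)[:, :len_keep]
    visible = torch.gather(patches, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D))

    # Binary mask over all L positions, 0 where a patch was kept.
    mask = torch.ones(B, L, device=patches.device)
    mask.scatter_(1, ids_keep, 0.0)
    return visible, mask

# Example: 196 patches per image (14x14 grid); ratio 0.75 keeps 49 visible.
vis, mask = random_masking(torch.randn(8, 196, 768))
```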