Upping the Game: How 2D U-Net Skip Connections Flip 3D Segmentation

Authors: Xingru Huang, Yihao Guo, Jian Huang, Tianyun Zhang, Hong He, Shaowei Jiang, Yaoqi Sun

NeurIPS 2024

Reproducibility Variable Result LLM Response
Research Type Experimental Through rigorous experimental validation on five publicly accessible datasets (FLARE2021, OIMHS, FeTA2021, AbdomenCT-1K, and BTCV), the proposed method surpasses contemporary state-of-the-art models.
Researcher Affiliation Academia Xingru Huang1, Yihao Guo1, Jian Huang1, Tianyun Zhang1, Hong He1, Shaowei Jiang1, Yaoqi Sun1 (1Hangzhou Dianzi University)
Pseudocode No The paper describes the proposed methods and modules textually and with diagrams but does not include explicitly labeled pseudocode or algorithm blocks.
Open Source Code Yes Our implementation is available at https://github.com/IMOP-lab/U-Shaped-Connection.
Open Datasets Yes Through rigorous experimental validation on five publicly accessible datasets (FLARE2021, OIMHS, FeTA2021, AbdomenCT-1K, and BTCV), the proposed method surpasses contemporary state-of-the-art models.
Dataset Splits Yes The datasets are randomly partitioned in an 8:1:1 ratio for training, validation, and testing.
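The reported 8:1:1 train/validation/test partition can be sketched in plain Python; the random seed and index-based shuffling here are illustrative assumptions, not details from the paper:

```python
import random

def split_indices(n_samples, seed=0):
    """Randomly partition sample indices into train/val/test at an
    8:1:1 ratio, as described in the paper. The seed and the use of
    index shuffling are assumptions for reproducibility of this sketch."""
    rng = random.Random(seed)
    indices = list(range(n_samples))
    rng.shuffle(indices)
    n_train = int(0.8 * n_samples)
    n_val = int(0.1 * n_samples)
    return (indices[:n_train],
            indices[n_train:n_train + n_val],
            indices[n_train + n_val:])

train_idx, val_idx, test_idx = split_indices(100)
```

For 100 volumes this yields 80/10/10 disjoint index sets covering the whole dataset.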
Hardware Specification Yes The experiments are conducted on identical hardware and software environments, each workstation equipped with two NVIDIA GeForce RTX 4090 GPUs and 128GB of memory.
Software Dependencies Yes The framework employs Python 3.9, PyTorch 2.0.0, and MONAI 0.9.0 within a Distributed Data-Parallel (DDP) training framework.
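A minimal environment pin matching the reported versions might look like the following; the conda environment name and the choice of pip for the libraries are assumptions (the paper does not specify an installation method or CUDA variant):

```shell
# Hypothetical environment setup matching the reported versions;
# CUDA build and package channels are assumptions.
conda create -n u-shaped-connection python=3.9
conda activate u-shaped-connection
pip install torch==2.0.0 monai==0.9.0
```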
Experiment Setup Yes All training utilizes the combined Dice and cross-entropy loss (LDiceCE) with the AdamW [55] optimizer, a learning rate of 0.0001, 80,000 training iterations, and a batch size of 2. Data augmentation techniques, including random flip, random rotation, random scaling, and random 3D elastic transformation, are applied to enhance dataset diversity and model generalization.
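The LDiceCE objective named above combines a soft Dice term with a cross-entropy term. A toy binary version in plain Python is sketched below; the paper's actual implementation almost certainly operates on multi-class 3D volumes (likely via MONAI's DiceCELoss), so this is only an illustration of the formula:

```python
import math

def dice_ce_loss(probs, targets, eps=1e-6):
    """Toy combined Dice + cross-entropy loss for a flat binary mask.
    probs: predicted foreground probabilities in (0, 1);
    targets: ground-truth 0/1 labels. Equal weighting of the two
    terms is an assumption."""
    # Soft Dice coefficient: 2*|P∩G| / (|P| + |G|), smoothed by eps.
    inter = sum(p * t for p, t in zip(probs, targets))
    dice = (2.0 * inter + eps) / (sum(probs) + sum(targets) + eps)
    dice_loss = 1.0 - dice
    # Mean binary cross-entropy over all voxels.
    ce_loss = -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                   for p, t in zip(probs, targets)) / len(probs)
    return dice_loss + ce_loss
```

A near-perfect prediction (e.g. probabilities 0.99 on foreground, 0.01 on background) drives both terms toward zero, while an inverted prediction inflates both.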