Epipolar-Free 3D Gaussian Splatting for Generalizable Novel View Synthesis
Authors: Zhiyuan Min, Yawei Luo, Jianwen Sun, Yi Yang
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate eFreeSplat on wide-baseline novel view synthesis tasks using the RealEstate10K and ACID datasets. Extensive experiments demonstrate that eFreeSplat surpasses state-of-the-art baselines that rely on epipolar priors, achieving superior geometry reconstruction and novel view synthesis quality. |
| Researcher Affiliation | Academia | Zhiyuan Min¹, Yawei Luo¹, Jianwen Sun², Yi Yang¹ (¹Zhejiang University, ²Central China Normal University) |
| Pseudocode | No | The paper does not contain a clearly labeled section or figure titled 'Pseudocode' or 'Algorithm'. |
| Open Source Code | Yes | Project page: https://tatakai1.github.io/efreesplat/. |
| Open Datasets | Yes | eFreeSplat is trained on RealEstate10K [72] and ACID [26]. |
| Dataset Splits | No | Following pixelSplat [6], we use the provided training and testing splits and evaluate three novel view images on each test scene. The paper does not explicitly mention a 'validation' split or its specific proportions/counts. |
| Hardware Specification | Yes | All models are trained on 4 RTX-4090 GPUs for 300,000 iterations using the Adam optimizer [24]. |
| Software Dependencies | No | The paper mentions using specific models like 'ViT-B vision transformer' and 'CroCo v2' and an 'Adam optimizer', but it does not specify version numbers for any software libraries or dependencies (e.g., PyTorch, TensorFlow, specific Python versions). |
| Experiment Setup | Yes | All models are trained on 4 RTX-4090 GPUs for 300,000 iterations using the Adam optimizer [24]. The per-GPU batch size during training is 4. ... using the Adam optimizer with a learning rate 2e-4. ... the resolution of our training and testing images for fair comparison (256×256). |
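
The Experiment Setup row above quotes the core training hyperparameters. Below is a minimal sketch, assuming a PyTorch setup, of how those values might be wired together; the `RandomSceneDataset` and placeholder model are hypothetical stand-ins, and only the quoted hyperparameters (4 GPUs, per-GPU batch size of 4, 300,000 iterations, Adam with a 2e-4 learning rate, 256×256 images) come from the paper.

```python
# Sketch of the quoted training configuration, assuming PyTorch.
# The dataset and model classes are hypothetical placeholders;
# only the hyperparameter values are taken from the paper's quoted text.
import torch
from torch.utils.data import DataLoader, Dataset

NUM_GPUS = 4             # 4x RTX-4090 (quoted)
BATCH_SIZE_PER_GPU = 4   # quoted per-GPU batch size
TOTAL_ITERATIONS = 300_000
LEARNING_RATE = 2e-4
IMAGE_SIZE = 256         # training/testing resolution is 256x256


class RandomSceneDataset(Dataset):
    """Hypothetical stand-in for the RealEstate10K / ACID data loaders."""

    def __len__(self):
        return 10_000

    def __getitem__(self, idx):
        # Two context views and one target view as 3xHxW RGB tensors.
        return {
            "context": torch.rand(2, 3, IMAGE_SIZE, IMAGE_SIZE),
            "target": torch.rand(3, IMAGE_SIZE, IMAGE_SIZE),
        }


def build_training_objects(model: torch.nn.Module):
    """Return an Adam optimizer and a data loader using the quoted values."""
    optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)
    loader = DataLoader(
        RandomSceneDataset(),
        batch_size=BATCH_SIZE_PER_GPU,
        shuffle=True,
        num_workers=4,
    )
    return optimizer, loader


if __name__ == "__main__":
    # Placeholder network: the actual eFreeSplat model is not reproduced here.
    model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)
    optimizer, loader = build_training_objects(model)
    print(f"Effective batch size across GPUs: {NUM_GPUS * BATCH_SIZE_PER_GPU}")
    print(f"Total training iterations: {TOTAL_ITERATIONS}")
```

In a multi-GPU run, the per-GPU batch size of 4 yields an effective batch size of 16 across the 4 GPUs; the sketch above only prints that arithmetic and does not implement distributed training.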