In this paper, we introduce a geometry-aware real-time rendering method for large-scale scenes. Compared to previous NeRF-based methods (such as Mega-NeRF and Switch-NeRF) and 3DGS-based methods (including 3DGS, GaMeS, and Scaffold-GS), our method renders large scenes with high fidelity while maintaining reasonable training time and memory usage. Extensive experiments on 13 scenes, including eight from four public datasets (MatrixCity-Aerial, Mill-19, Tanks & Temples, WHU) and five self-collected scenes from SCUT-CA and plateau regions, further demonstrate the generalization capability of our method.
We perform experiments on eight scenes drawn from four public datasets, released with Mega-NeRF, MatrixCity, 3D Gaussian Splatting, and WHU.
- Mill-19 dataset: Please download the data from the Mega-NeRF repository
- MatrixCity-Aerial dataset: Please download the data from the MatrixCity project page
- Tanks & Temples dataset: Please download the data from the 3D Gaussian Splatting repository
- WHU dataset: Please download the data from the WHU dataset website
We perform experiments on five self-collected scenes (SCUT Campus and plateau regions).
Please contact us:
Haihong Xiao and Jianan Zou: auhhxiao@mail.scut.edu.cn; 202130450216@mail.scut.edu.cn
Prof. Wenxiong Kang: auwxkang@scut.edu.cn
We tested on a server running Ubuntu 20.04 with CUDA 11.8 and GCC 9.4.0. Other similar configurations should also work, but we have not verified each one individually.
- Clone this repo:
git clone https://github.com/SCUT-BIP-Lab/Geo_gs.git
cd Geo_gs
- Install dependencies
conda create -n Geo_gs python=3.8
conda activate Geo_gs
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
pip install -r requirements.txt
3. Data preparation
First, create a data/ folder inside the project path:

mkdir data
The data structure will be organised as follows:
data/
├── dataset_name
│   ├── scene1/
│   │   ├── images/
│   │   │   ├── IMG_0.jpg
│   │   │   ├── IMG_1.jpg
│   │   │   ├── ...
│   │   └── sparse/
│   │       └── 0/
│   ├── scene2/
│   │   ├── images/
│   │   │   ├── IMG_0.jpg
│   │   │   ├── IMG_1.jpg
│   │   │   ├── ...
│   │   └── sparse/
│   │       └── 0/
│   ├── ...
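As a quick sanity check before training, you can verify that each scene folder matches this layout. The following is a minimal sketch, not part of the repository: the `check_scene` and `check_dataset` helpers are our own, and they only test for the `images/` and `sparse/0/` subfolders shown in the COLMAP-style tree above.

```python
import os

def check_scene(scene_dir):
    """Return True if scene_dir has a non-empty images/ folder
    and a sparse/0/ folder for the COLMAP reconstruction."""
    images = os.path.join(scene_dir, "images")
    sparse0 = os.path.join(scene_dir, "sparse", "0")
    if not os.path.isdir(images) or not os.listdir(images):
        return False
    return os.path.isdir(sparse0)

def check_dataset(dataset_dir):
    """Map each scene under data/<dataset_name>/ to its check result."""
    return {scene: check_scene(os.path.join(dataset_dir, scene))
            for scene in sorted(os.listdir(dataset_dir))}
```

Running `check_dataset("data/dataset_name")` before launching training can catch a missing `sparse/0/` folder early, which is a common cause of COLMAP-loading errors.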
You can quickly train the model and obtain test results on the MatrixCity-Aerial dataset using the following commands:
conda activate Geo_gs
cd Geo_gs
bash single_train.sh
You can run other scene datasets by either modifying the bash file above or executing the following commands. For the specific file modifications required, please contact Haihong Xiao and Jianan Zou, and we will provide assistance.
conda activate Geo_gs
cd Geo_gs
python train.py -s <path to COLMAP or NeRF Synthetic dataset> --eval # Train with train/test split
python render.py -m <path to trained model>
python metrics.py -m <path to trained model>
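To run the same train/render/metrics pipeline over several scenes without editing the bash file by hand, the three commands can be assembled programmatically. This is a minimal sketch under our own assumptions: the `make_commands` helper and the example scene paths are hypothetical, and we assume `train.py` accepts a `-m` output path as in the original 3D Gaussian Splatting codebase (adjust if this repo handles output differently).

```python
def make_commands(scene_path, model_path):
    """Build the train / render / metrics command lines shown above
    for one scene, as argument lists suitable for subprocess.run."""
    return [
        ["python", "train.py", "-s", scene_path, "-m", model_path, "--eval"],
        ["python", "render.py", "-m", model_path],
        ["python", "metrics.py", "-m", model_path],
    ]

if __name__ == "__main__":
    import subprocess
    # Hypothetical scene list; replace with your own data paths.
    scenes = {"building": "data/mill19/building"}
    for name, path in scenes.items():
        for cmd in make_commands(path, f"output/{name}"):
            subprocess.run(cmd, check=True)
```

`check=True` stops the loop if training fails for a scene, so rendering and metrics are not run on a missing model.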
Visual comparisons on the Mill-19 and MatrixCity datasets:
Visual comparisons on SCUT_CA outdoor scenes:
Visual comparisons on the WHU dataset and Plateau Region scenes:
Our code builds on several awesome repositories. We appreciate the authors for making their code publicly available.