CrossLoc3D: Aerial-Ground Cross-Source 3D Place Recognition
Tianrui Guan, Aswath Muthuselvam, Montana Hoover, Xijun Wang, Jing Liang, Adarsh Jagan Sathyamoorthy, Damon Conover, Dinesh Manocha
Representation gap between aerial and ground sources: we use bounding boxes of the same color to focus on the same region and highlight the differences between aerial (left) and ground (right) LiDAR scans.
- Scopes (cyan): aerial scans cover a large region, while ground scans cover only a local area.
- Coverages (green): aerial scans capture the tops of buildings, while ground scans capture more detail at ground level.
- Densities (blue): the distribution and density of the points differ because of the varying scan patterns, effective ranges, and fidelity of the LiDARs.
- Noise patterns (red): aerial scans are noisier, as seen from the bird's-eye and top-down views of a corner of the building.
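The density gap described above can be quantified with a simple voxel-occupancy statistic. The sketch below is purely illustrative and runs on synthetic point clouds; the cloud sizes, extents, and voxel size are assumptions, not values from the paper:

```python
import numpy as np

def voxel_density(points, voxel_size=0.5):
    """Mean number of points per occupied voxel: a rough proxy for scan density."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, counts = np.unique(keys, axis=0, return_counts=True)
    return counts.mean()

rng = np.random.default_rng(0)
# Synthetic stand-ins: a sparse, wide-area "aerial" scan vs. a dense, local "ground" scan
aerial = rng.uniform(0.0, 100.0, size=(2_000, 3))
ground = rng.uniform(0.0, 10.0, size=(50_000, 3))

print(voxel_density(aerial), voxel_density(ground))
```

On these synthetic clouds the ground scan packs several times more points into each occupied voxel, mirroring the density mismatch the figure highlights.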
If you find this project useful in your research, please cite our work:
```bibtex
@InProceedings{Guan_2023_ICCV,
    author    = {Guan, Tianrui and Muthuselvam, Aswath and Hoover, Montana and Wang, Xijun and Liang, Jing and Sathyamoorthy, Adarsh Jagan and Conover, Damon and Manocha, Dinesh},
    title     = {CrossLoc3D: Aerial-Ground Cross-Source 3D Place Recognition},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
}
```
```shell
conda create -n crossloc python=3.7 pandas tensorboard numpy -c conda-forge
conda activate crossloc
conda install pytorch=1.9.1 torchvision cudatoolkit=11.1 -c pytorch -c nvidia
conda install openblas-devel -c anaconda

# OpenEXR dependencies
sudo apt-get install openexr libopenexr-dev
conda install -c conda-forge openexr

pip install laspy pytest addict pytorch-metric-learning==0.9.97 yapf==0.40.1 bitarray==1.6.0 h5py transforms3d open3d
pip install tqdm setuptools==59.5.0 einops
pip install bagpy utm pptk
conda install -c conda-forge openexr-python
pip install pyexr pyntcloud

# Build MinkowskiEngine from source against the conda-provided OpenBLAS
cd MinkowskiEngine
python setup.py install --blas_include_dirs=${CONDA_PREFIX}/include --blas=openblas
```
Follow the instructions in this repo or download `benchmark_datasets.zip` from here, then put the `benchmark_datasets` folder in the `data` folder.
```shell
python ./datasets/preprocess/generate_training_tuples_baseline.py
python ./datasets/preprocess/generate_test_sets.py
```
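Tuple-generation scripts for this benchmark family typically define positives and negatives by the distance between submap centers. The sketch below illustrates that idea only; the `make_tuples` helper and the 10 m / 50 m thresholds are assumptions for illustration, not the scripts' actual logic:

```python
import numpy as np

def make_tuples(centers, pos_thresh=10.0, neg_thresh=50.0):
    """For each submap center, collect positives (closer than pos_thresh)
    and negatives (farther than neg_thresh); the band in between is ignored."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    tuples = []
    for i in range(len(centers)):
        pos = np.where((d[i] < pos_thresh) & (np.arange(len(centers)) != i))[0]
        neg = np.where(d[i] > neg_thresh)[0]
        tuples.append((i, pos, neg))
    return tuples

# Three toy submap centers: two nearby (a positive pair), one far away (a negative)
centers = np.array([[0.0, 0.0], [5.0, 0.0], [100.0, 0.0]])
t = make_tuples(centers)
print(t[0][1], t[0][2])  # positives of submap 0: [1]; negatives: [2]
```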
The dataset can be accessed here. Download the data and put the `benchmark_datasets` folder in the `data` folder.
```shell
# Training
CUDA_VISIBLE_DEVICES=0 python main.py ./configs/<config_file>.py

# Evaluation from a checkpoint
CUDA_VISIBLE_DEVICES=0 python main.py ./configs/<config_file>.py --mode val --resume_from <ckpt_location>.pth
```
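The commands above pass a Python config file to `main.py`. A common way such file-based configs are consumed is to import the file as a module and read its top-level attributes; the sketch below shows that pattern only, and the `load_config` helper plus the `batch_size`/`lr` fields are hypothetical, not this repo's actual schema:

```python
import importlib.util
import os
import tempfile

def load_config(path):
    """Load a .py config file as a module and return its public attributes."""
    spec = importlib.util.spec_from_file_location("cfg", path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return {k: v for k, v in vars(mod).items() if not k.startswith("_")}

# Hypothetical minimal config, written to a temp file for illustration only
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("batch_size = 16\nlr = 1e-3\n")
    cfg_path = f.name

cfg = load_config(cfg_path)
os.remove(cfg_path)
print(cfg["batch_size"], cfg["lr"])  # 16 0.001
```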
| Name | Dataset | Config | Checkpoint |
|---|---|---|---|
| CrossLoc3D | Oxford | config | ckpt |
| CrossLoc3D | CS-Campus3D | config | ckpt |