
CRF360D

Official source code of the paper "Monocular 360 Depth Estimation via Spherical Fully-Connected CRFs", arXiv, Project page

Preparation

Installation

Environments

  • python 3.9
  • PyTorch 1.13.0, CUDA 11.7, torchvision 0.14.0
  • Platform: NVIDIA 3090 GPU

Install requirements

pip install -r requirements.txt
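
After installation, the short sketch below can be used to confirm that the installed versions match the ones in the Environments list above; the expected values in the comments are taken from that list.

import torch
import torchvision

# Expected versions from the Environments list above.
print("PyTorch:", torch.__version__)            # 1.13.0
print("torchvision:", torchvision.__version__)  # 0.14.0
print("CUDA build:", torch.version.cuda)        # 11.7
print("CUDA available:", torch.cuda.is_available())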

Datasets

Please download the preferred datasets, i.e., Matterport3D and Stanford2D3D. For Matterport3D, please preprocess it following UniFuse (Matterport3D/README.md).

Training

CRF360D on Matterport3D

python train.py --config ./configs/train_matterport3d/b5_matterpot3d.yaml

CRF360D on Stanford2D3D

python train.py --config ./configs/train_stanford2d3d/b5_stanford2d3d.yaml

The procedure is similar for other datasets, such as Structured3D; see the launcher sketch below.
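
To run several training configurations in one go, a minimal launcher sketch is shown below. It simply iterates over config files; the glob pattern assumes additional configs follow the ./configs/train_<dataset>/*.yaml layout used above, which is an assumption about naming, not a guarantee.

import glob
import subprocess

# Assumes training configs follow the ./configs/train_<dataset>/*.yaml layout
# shown above; adjust the glob pattern to your setup.
for config in sorted(glob.glob("./configs/train_*/*.yaml")):
    print("Training with config:", config)
    subprocess.run(["python", "train.py", "--config", config], check=True)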

Evaluation

Pre-trained models

Pre-trained CRF360D models are available for two datasets: Matterport3D and Stanford2D3D.

Test with a pre-trained model

python evaluate.py --config ./configs/test_matterport3d/panocrf_b5.yaml 
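
For reference, the sketch below shows how the standard monocular depth metrics (Abs Rel, RMSE, δ < 1.25) are typically computed from predicted and ground-truth depth maps. It is an illustrative reimplementation with assumed depth clipping thresholds, not the repository's evaluation code.

import numpy as np

def depth_metrics(pred, gt, min_depth=0.1, max_depth=10.0):
    # Illustrative metrics; min_depth/max_depth are assumed clipping thresholds.
    valid = (gt > min_depth) & (gt < max_depth)
    pred, gt = pred[valid], gt[valid]
    abs_rel = np.mean(np.abs(pred - gt) / gt)                   # absolute relative error
    rmse = np.sqrt(np.mean((pred - gt) ** 2))                   # root mean squared error
    delta1 = np.mean(np.maximum(pred / gt, gt / pred) < 1.25)   # accuracy under threshold 1.25
    return {"abs_rel": abs_rel, "rmse": rmse, "delta1": delta1}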

Citation

Please cite our paper if you find our work useful in your research.

@article{cao2024crf360d,
  title={CRF360D: Monocular 360 Depth Estimation via Spherical Fully-Connected CRFs},
  author={Cao, Zidong and Wang, Lin},
  journal={arXiv preprint arXiv:2405.11564},
  year={2024}
}
