PARSAC: Accelerating Robust Multi-Model Fitting with Parallel Sample Consensus

The paper with supplementary material is available on arXiv:
https://arxiv.org/abs/2401.14919

If you use this code, please cite our paper:

@inproceedings{kluger2024parsac,
  title={PARSAC: Accelerating Robust Multi-Model Fitting with Parallel Sample Consensus},
  author={Kluger, Florian and Rosenhahn, Bodo},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2024}
}

Related repositories:

Installation

Get the code:

git clone --recurse-submodules https://github.com/fkluger/parsac.git
cd parsac
git submodule update --init --recursive

Set up the Python environment using Anaconda:

conda env create -f environment.yml
source activate parsac
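
Optionally, verify the environment before continuing. This is a quick sanity check, assuming the environment provides PyTorch (which the training and evaluation code depends on):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"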

Datasets

HOPE-F

Download the HOPE-F dataset and extract it inside the datasets/hope directory. The small dataset without images is sufficient for training and evaluation.
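
For example, extraction might look like this (the archive name below is hypothetical; use the filename of the actual download):

mkdir -p datasets/hope
tar -xf hope_f_small.tar.gz -C datasets/hope  # hypothetical archive name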

Synthetic Metropolis Homographies

Download the SMH dataset and extract it inside the datasets/smh directory. The small dataset without images is sufficient for training and evaluation.
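
Extraction works the same way as for HOPE-F (again, the archive name is hypothetical):

mkdir -p datasets/smh
tar -xf smh_small.tar.gz -C datasets/smh  # hypothetical archive name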

NYU-VP

The vanishing point labels and pre-extracted line segments for the NYU dataset are fetched automatically via the nyu_vp submodule.

YUD and YUD+

Pre-extracted line segments and VP labels are fetched automatically via the yud_plus submodule. RGB images and camera calibration parameters, however, are not included. Download the original York Urban Dataset from the Elder Laboratory's website and store it under the datasets/yud_plus/data subfolder.
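
For example (the archive name is hypothetical; keep whatever name the download from the Elder Laboratory's website uses):

mkdir -p datasets/yud_plus/data
unzip YorkUrbanDB.zip -d datasets/yud_plus/data  # hypothetical archive name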

Adelaide-H/-F

We provide a mirror of the Adelaide dataset here: https://cloud.tnt.uni-hannover.de/index.php/s/egE6y9KRMxcLg6T. Download it and place the .mat files inside the datasets/adelaide directory.
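
For example, after downloading the files from the mirror (the source path below is illustrative):

mkdir -p datasets/adelaide
mv ~/Downloads/*.mat datasets/adelaide/  # adjust the source path to your download location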

Evaluation

To reproduce the results from the paper using our pre-trained network, first download the neural network weights, then follow the instructions on the EVAL page.

Training

If you want to train PARSAC from scratch, please follow the instructions on the TRAIN page.

License

BSD License
