To calculate the metrics presented in the article, see TRAINING.md. All models are trained with 2x NVIDIA V100-32GB GPUs, except configE, which is trained with four V100s.
The following configs were used for the tables in the paper:

```
configs/surface_guided/configA.py
configs/surface_guided/configB.py
configs/surface_guided/configC.py
configs/surface_guided/configD.py
configs/surface_guided/configE.py
configs/surface_guided/ablations/f_depth/configD_n0.py
configs/surface_guided/ablations/f_depth/configD_n2.py
configs/surface_guided/ablations/f_depth/configD_n4.py
configs/surface_guided/configD.py
configs/surface_guided/configB.py
configs/surface_guided/ablations/modulation/configB_spade.py
configs/surface_guided/ablations/modulation/configB_inade.py
configs/surface_guided/ablations/modulation/configB_clade.py
configs/surface_guided/ablations/modulation/configB_stylegan.py
configs/surface_guided/ablations/modulation/configB_comod.py
configs/surface_guided/configC.py
configs/surface_guided/configD.py
```
To compute the affine invariance and face FID metrics for a given config, run:

```
python -m tools.evaluation.affine_invariance configs/surface_guided/configD.py
python -m tools.evaluation.face_fid configs/surface_guided/configD.py
```
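To evaluate every main config in one go, a small bash loop suffices. This is only a sketch: it prints the commands instead of running them (running them requires the trained checkpoints), and it assumes the five main configs listed above.

```bash
# Print both evaluation commands for every main paper config.
# Remove the leading "echo" to actually run them (needs trained checkpoints).
for cfg in configs/surface_guided/configA.py \
           configs/surface_guided/configB.py \
           configs/surface_guided/configC.py \
           configs/surface_guided/configD.py \
           configs/surface_guided/configE.py; do
    echo python -m tools.evaluation.affine_invariance "$cfg"
    echo python -m tools.evaluation.face_fid "$cfg"
done
```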
The following describes how to generate anonymized versions of the COCO dataset.
To generate the final metrics, see the detectron2 documentation. We used `configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml` at commit `335b19830e4ea5c5a74a085a04ff4a2f1a1dbf71`.
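Pinning detectron2 to that exact commit can be done as follows. A dry-run sketch (drop each leading `echo` to execute): the clone URL is assumed to be the official facebookresearch/detectron2 repository, and the final line uses detectron2's standard `tools/train_net.py` script — registering the anonymized COCO dataset with detectron2 is a separate step covered in its documentation.

```bash
# Dry run: pin detectron2 to the commit used for the paper's numbers.
# Drop the leading "echo" on each line to execute for real.
echo git clone https://github.com/facebookresearch/detectron2
echo git -C detectron2 checkout 335b19830e4ea5c5a74a085a04ff4a2f1a1dbf71
echo python -m pip install -e detectron2
# Train Mask R-CNN with detectron2's standard training script:
echo python detectron2/tools/train_net.py \
    --config-file configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \
    --num-gpus 2
```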
- Detect on the dataset first:

  ```
  python3 -m tools.coco_anonymization.dump_detections configs/surface_guided/configE.py path/to/coco/dataset
  ```

- Then, anonymize the dataset:

  ```
  python3 -m tools.coco_anonymization.anonymize_dataset configs/surface_guided/configE.py path/to/coco/dataset path/to/coco/dataset_anonymized
  ```
This saves the anonymized dataset to `path/to/coco/dataset_anonymized`, which you can then use to train/validate with detectron2.
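The two steps above can be chained in one script. A minimal sketch using the placeholder paths from the text; the commands are printed rather than executed, so remove the `echo` prefixes to run the pipeline for real.

```bash
# Two-step COCO anonymization pipeline: detect, then anonymize.
# Paths are placeholders; adjust to your local dataset location.
set -eu
CFG=configs/surface_guided/configE.py
SRC=path/to/coco/dataset
DST=path/to/coco/dataset_anonymized

echo python3 -m tools.coco_anonymization.dump_detections "$CFG" "$SRC"
echo python3 -m tools.coco_anonymization.anonymize_dataset "$CFG" "$SRC" "$DST"
```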
To reproduce results with different types of traditional anonymizers, replace `configE.py` with a config from `configs/anonymizers`.