Implementation of Deformable Convolution V2 in PyTorch 1.0
- The only repo that attempts to reproduce the complete setup of Deformable Convolution V2 in pure, stable PyTorch code
- The CUDA ops are ported from MXNet to PyTorch, including Modulated Deformable Convolution and Modulated ROI Pooling, supporting the stable PyTorch 1.0 release; gradient test code is provided
- Full training and test details are provided, based on the maskrcnn-benchmark framework; the Feature Mimicking branch is implemented
- train.py only supports a single image per batch right now because I don't have enough resources to run multi-batch, multi-GPU training, but you can easily upgrade it to support both by checking train_net.py in the original repo, since the framework already supports everything needed
- Training is ongoing, so results and pretrained models will be published later; training is slow on my single-GPU system
- Inspired by the idea from Rethinking ImageNet Pre-training, the model is trained from scratch instead of fine-tuned from an ImageNet-pretrained model
- For the same reason, BatchNorm is replaced by GroupNorm
- Compared to the original paper, the weights for the different branches are adjusted and OHEM is used
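To make the "modulated" part concrete, here is an illustrative pure-PyTorch sketch of DCNv2's modulated deformable convolution (stride 1, same padding, no bias). It is a hypothetical, slow reference only — the repo's ported CUDA ops are the real implementation — and the `(dy, dx)` channel ordering of the predicted offsets is an assumption carried over from the MXNet convention:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModulatedDeformConvSketch(nn.Module):
    """Expository sketch of modulated deformable convolution (DCNv2).
    Assumes stride 1 and same padding; not the repo's CUDA implementation."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.k = k
        # one regular conv predicts 2*k*k offsets plus k*k modulation scalars
        self.offset_mask = nn.Conv2d(in_ch, 3 * k * k, k, padding=k // 2)
        nn.init.zeros_(self.offset_mask.weight)  # DCNv2 init: start as a plain conv
        nn.init.zeros_(self.offset_mask.bias)
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.01)

    def forward(self, x):
        B, C, H, W = x.shape
        k = self.k
        om = self.offset_mask(x)
        # assumed layout: (dy, dx) per kernel tap, then k*k modulation channels
        offset = om[:, : 2 * k * k].reshape(B, k * k, 2, H, W)
        mask = torch.sigmoid(om[:, 2 * k * k:])              # (B, k*k, H, W)
        ys, xs = torch.meshgrid(
            torch.arange(H, device=x.device, dtype=x.dtype),
            torch.arange(W, device=x.device, dtype=x.dtype), indexing="ij")
        r = torch.arange(k, device=x.device, dtype=x.dtype) - k // 2
        ky, kx = torch.meshgrid(r, r, indexing="ij")
        ky, kx = ky.reshape(-1), kx.reshape(-1)              # per-tap displacements
        cols = []
        for t in range(k * k):
            px = xs + kx[t] + offset[:, t, 1]                # (B, H, W)
            py = ys + ky[t] + offset[:, t, 0]
            # normalize to [-1, 1]; out-of-range taps sample zeros (zero padding)
            grid = torch.stack([2 * px / (W - 1) - 1,
                                2 * py / (H - 1) - 1], dim=-1)
            sampled = F.grid_sample(x, grid, align_corners=True)
            cols.append(sampled * mask[:, t:t + 1])          # modulate each tap
        col = torch.stack(cols, dim=2).reshape(B, C * k * k, H, W)
        # the k*k kernel applied as a 1x1 conv over the gathered columns
        return F.conv2d(col, self.weight.view(-1, C * k * k, 1, 1))
```

With the zero-initialized offset branch, the module behaves like a plain convolution scaled by the sigmoid(0) = 0.5 modulation, which is a handy sanity check.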
git clone git@github.com:TreB1eN/Deformable_Convolution_V2_Pytorch1.0.git
cd Deformable_Convolution_V2_Pytorch1.0/
- Install PyTorch 1.0 and torchvision following the official instructions.
- Install dependencies
pip install -r requirements.txt
- Compile CUDA ops.
./compile.sh # or "PYTHON=python3 ./compile.sh" if you use system python3 without virtual environments
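After compiling, it is worth verifying the custom ops' backward passes. The repo ships its own gradient test; as a generic illustration of the technique, `torch.autograd.gradcheck` compares analytic gradients against finite differences (shown here on a plain `Conv2d`, not the custom op):

```python
import torch
from torch.autograd import gradcheck

# gradcheck needs double precision for stable numerical derivatives
conv = torch.nn.Conv2d(2, 3, kernel_size=3, padding=1).double()
x = torch.randn(1, 2, 5, 5, dtype=torch.double, requires_grad=True)

# raises / returns False if the analytic backward disagrees with finite differences
assert gradcheck(conv, (x,), eps=1e-6, atol=1e-4)
```

The same call pattern applies to any custom autograd Function: pass the op and double-precision inputs with `requires_grad=True`.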
Incoming
For the following examples to work, you need to download the COCO dataset.
We recommend symlinking the path to the COCO dataset to datasets/ as follows.
We use the minival and valminusminival sets from Detectron.
# symlink the coco dataset
cd ~/github/maskrcnn-benchmark
mkdir -p datasets/coco
ln -s /path_to_coco_dataset/annotations datasets/coco/annotations
ln -s /path_to_coco_dataset/train2014 datasets/coco/train2014
ln -s /path_to_coco_dataset/test2014 datasets/coco/test2014
ln -s /path_to_coco_dataset/val2014 datasets/coco/val2014
# for pascal voc dataset:
ln -s /path_to_VOCdevkit_dir datasets/voc
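A quick way to sanity-check the symlinked COCO layout before training is to look for the four expected subdirectories. This is a hypothetical helper, not part of the repo:

```python
import os

def missing_coco_dirs(root="datasets/coco"):
    """Return the expected COCO subdirectories that are absent under root.
    os.path.isdir follows symlinks, so symlinked dataset dirs count as present."""
    expected = ["annotations", "train2014", "val2014", "test2014"]
    return [d for d in expected if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_coco_dirs()
    print("missing: " + ", ".join(missing) if missing else "COCO layout looks complete")
```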
Deformable_Convolution_V2
├── work_space
│ ├── model
│ ├── log
│ ├── final
│ └── save
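The work_space tree above can be created in one go; the subdirectory names below are taken from the layout shown, and the helper itself is just a convenience sketch:

```python
import os

def make_work_space(root="work_space"):
    """Create the work_space subdirectories shown in the layout above."""
    for sub in ["model", "log", "final", "save"]:
        os.makedirs(os.path.join(root, sub), exist_ok=True)

if __name__ == "__main__":
    make_work_space()
```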
For training, just run:
python train.py
All detailed configuration is in configs/e2e_deformconv_mask_rcnn_R_50_C5_1x.yaml
Email : treb1en@qq.com
Questions and PRs are welcome, especially help in getting better-trained models, as I don't have enough resources to train them sufficiently.