- Prepare the training dataset: FFHQ. More details are in DatasetPreparation.md.
  - Download the FFHQ dataset. We recommend downloading the tfrecords files from NVlabs/ffhq-dataset.
  - Extract tfrecords to images or LMDBs (TensorFlow is required to read tfrecords):

    ```bash
    python scripts/extract_images_from_tfrecords.py
    ```
- Modify the config file in `options/train/StyleGAN/train_StyleGAN2_256_Cmul2_FFHQ.yml`.
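  The exact fields depend on the BasicSR version; a training config of this kind typically contains entries like the sketch below. The field names and values here are illustrative assumptions, not the shipped file — consult the actual yml for the real keys:

  ```yaml
  # Illustrative fragment only; check the shipped yml for the authoritative keys.
  name: train_StyleGAN2_256_Cmul2_FFHQ
  num_gpu: 8            # usually matched to --nproc_per_node in the launch command

  datasets:
    train:
      name: FFHQ
      dataroot_gt: datasets/ffhq/ffhq_256.lmdb   # path to your extracted images/LMDB
  ```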
- Train with distributed training. More training commands are in TrainTest.md.

  ```bash
  python -m torch.distributed.launch --nproc_per_node=8 --master_port=4321 basicsr/train.py -opt options/train/StyleGAN/train_StyleGAN2_256_Cmul2_FFHQ_800k.yml --launcher pytorch
  ```
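  For context, `torch.distributed.launch` spawns `--nproc_per_node` worker processes and tells each one its rank; depending on the PyTorch version this arrives as a `--local_rank` argument or as environment variables. A minimal, stdlib-only sketch of the pattern a worker uses to pick its device (names follow the PyTorch launcher convention; this is not BasicSR's actual startup code):

  ```python
  import os

  # The launcher exports RANK / WORLD_SIZE (and, in newer versions, LOCAL_RANK)
  # for each spawned worker; a training script reads them to select a GPU.
  local_rank = int(os.environ.get("LOCAL_RANK", "0"))
  world_size = int(os.environ.get("WORLD_SIZE", "1"))
  device = f"cuda:{local_rank}"  # each worker binds to its own GPU

  print(device, world_size)
  ```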
- Download the pre-trained models from ModelZoo to the `experiments/pretrained_models` folder.
- Test:

  ```bash
  python tests/test_stylegan2.py
  ```
- The results are in the `samples` folder.
-
Install dlib, because DFDNet uses dlib to do face recognition and landmark detection. Installation reference.
- Clone dlib repo:
git clone git@github.com:davisking/dlib.git
cd dlib
- Install:
python setup.py install
- Clone dlib repo:
- Download the dlib pretrained models from ModelZoo to the `experiments/pretrained_models/dlib` folder.
  You can either run the following command or manually download the pretrained models:

  ```bash
  python scripts/download_pretrained_models.py --method dlib
  ```
- Download the pretrained DFDNet models, dictionary, and face template from ModelZoo to the `experiments/pretrained_models/DFDNet` folder.
  You can either run the following command or manually download the pretrained models:

  ```bash
  python scripts/download_pretrained_models.py --method DFDNet
  ```
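  Before testing, it can help to confirm the downloads actually landed in the expected folders. A small stdlib-only helper for that check (the folder paths come from the steps above; the helper itself is not part of BasicSR):

  ```python
  from pathlib import Path

  def has_pretrained_files(folder: str) -> bool:
      """Return True if the folder exists and contains at least one file."""
      p = Path(folder)
      return p.is_dir() and any(child.is_file() for child in p.rglob("*"))

  # Folders used in the download steps above.
  for folder in ("experiments/pretrained_models/dlib",
                 "experiments/pretrained_models/DFDNet"):
      print(folder, has_pretrained_files(folder))
  ```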
- Prepare the testing dataset in the `datasets` folder. For example, we put images in the `datasets/TestWhole` folder.
- Test:

  ```bash
  python tests/test_face_dfdnet.py --upscale_factor=2 --test_path datasets/TestWhole
  ```
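  The flags above suggest the test script exposes a small command-line interface. A stdlib sketch of parsing those same flags (this parser is illustrative, not the script's actual code, and the default values are assumptions):

  ```python
  import argparse

  # Mirror the two flags used in the command above (illustrative only).
  parser = argparse.ArgumentParser(description="DFDNet face restoration test")
  parser.add_argument("--upscale_factor", type=int, default=1,
                      help="upscale factor for the final output")
  parser.add_argument("--test_path", type=str, default="datasets/TestWhole",
                      help="folder containing the input face images")

  args = parser.parse_args(["--upscale_factor=2", "--test_path", "datasets/TestWhole"])
  print(args.upscale_factor, args.test_path)
  ```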
- The results are in the `results/DFDNet` folder.