Toward Fine-grained Facial Expression Manipulation (ECCV 2020, Paper)
Arbitrary Facial Expression Manipulation. Our model can 1) perform continuous editing between two expressions (top); 2) learn to modify only one facial component (middle); 3) transform expressions in paintings (bottom). From left to right, the emotion intensity is set to 0, 0.5, 0.75, 1, and 1.25.
Single/multiple AU editing. AU4: Brow Lowerer; AU5: Upper Lid Raiser; AU7: Lid Tightener; AU12: Lip Corner Puller; AU15: Lip Corner Depressor; AU20: Lip Stretcher. The legend below the images gives the relative AU intensities: a higher (lower) AU value strengthens (weakens) the corresponding facial action unit in the input image.
Arbitrary Facial Expression Manipulation. The top-left image with the blue box is the input; images in odd rows show the target expressions, and images in even rows are the animated results.
Here are some links to help you get to know Action Units better.
- Install PyTorch (version 0.4.1 or >= 1.3.0) and torchvision.
- Install the dependencies listed in `requirements.txt`: numpy, matplotlib, tqdm, pickle, opencv-python, tensorboardX, face_alignment.
- Prepare your images (EmotionNet, AffectNet, etc.).
- Extract the Action Units with OpenFace and generate `aus_dataset.pkl`, which contains a list of dicts, e.g., `[{'file_path': <path of image1>, 'aus': <extracted aus of image1>}, {'file_path': <path of image2>, 'aus': <extracted aus of image2>}]`. Please refer to `src/samples/aus_dataset.pkl` for an example.
- You may use pickle to save the `.pkl` file: `with open('aus_dataset.pkl', 'wb') as f: pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)`. A sketch of how the whole file might be assembled is given below this list.
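As an illustration only, here is a minimal sketch of how `aus_dataset.pkl` could be assembled, assuming OpenFace has written one CSV per image whose AU intensity columns are named like `AU01_r`; the directory names `data/images` and `data/openface` are placeholders, not paths from this repo:

```python
# Illustrative sketch (not part of this repo): collect per-image AU intensities
# from OpenFace CSV outputs into the list-of-dicts format described above.
import csv
import os
import pickle

import numpy as np

image_dir = 'data/images'       # placeholder: directory with your face images
openface_dir = 'data/openface'  # placeholder: directory with OpenFace .csv outputs

data = []
for name in sorted(os.listdir(image_dir)):
    stem, _ = os.path.splitext(name)
    csv_path = os.path.join(openface_dir, stem + '.csv')
    if not os.path.isfile(csv_path):
        continue  # skip images without an OpenFace result

    with open(csv_path, newline='') as fh:
        row = next(csv.DictReader(fh))            # first (only) row for a single image
    row = {k.strip(): v for k, v in row.items()}  # OpenFace may pad column names with spaces

    # Keep the AU intensity columns (e.g., AU01_r, AU02_r, ...) in a fixed order.
    au_cols = sorted(k for k in row if k.startswith('AU') and k.endswith('_r'))
    aus = np.array([float(row[k]) for k in au_cols], dtype=np.float32)

    data.append({'file_path': os.path.join(image_dir, name), 'aus': aus})

# Save with the same pickle.dump call shown above.
with open('aus_dataset.pkl', 'wb') as f:
    pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)
```

The resulting list of dicts matches the `{'file_path': ..., 'aus': ...}` format described above.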
To train, please modify the parameters in `launch/train.sh` and run:

```bash
bash launch/train.sh
```
If you find this repository helpful, use this code, or adopt ideas from the paper in your research, please cite:
@inproceedings{ling2020toward,
title={Toward Fine-grained Facial Expression Manipulation},
author={Ling, Jun and Xue, Han and Song, Li and Yang, Shuhui and Xie, Rong and Gu, Xiao},
booktitle={European Conference on Computer Vision},
pages={37--53},
year={2020},
organization={Springer}
}
Please contact lingjun@sjtu.edu.cn or open an issue for any questions or suggestions.
- Thanks to Albert Pumarola for sharing his work GANimation.
- Thanks to the authors of AffectNet for their work on the large-scale emotion database.