- 🔔 This is the official PyTorch implementation of the paper "MetaFBP: Learning to Learn High-Order Predictor for Personalized Facial Beauty Prediction", ACM MM 2023.
- 🛖 This repository is built on dragen1860/MAML-Pytorch. You can also refer to that project for details: https://github.com/dragen1860/MAML-Pytorch
The main Python libraries we use (a minimal installation sketch follows the list):
- Python 3.8
- torch 1.8.1
- numpy 1.19.2
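If you want to set up a matching environment quickly, the sketch below may help; the plain `pip` command is only an assumption about your setup, and the CUDA-specific build of torch 1.8.1 (if you need GPU support) has to be chosen for your own machine:

```bash
# Minimal environment sketch: versions match the list above; pick the torch
# wheel that matches your CUDA version if you plan to train on GPU.
pip install torch==1.8.1 numpy==1.19.2
```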
Please create a directory named `datasets` in the current directory, then download the following datasets and unzip them into `datasets`:
Download link: GoogleDrive or Quark
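As a rough sketch of the expected preparation steps, downloading and unpacking might look like the commands below; the archive names are only placeholders for whatever files the download links provide:

```bash
# Hypothetical example: create the data root and unpack the downloaded
# archives into it (replace the archive names with the actual files you get).
mkdir -p datasets
unzip FBP5500.zip -d datasets/
unzip FBPSCUT.zip -d datasets/
unzip US10K.zip   -d datasets/
```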
You can also change the root directory of the datasets by modifying the default value of the `--data-root` argument in `train_fea.py` (L34), `train.py` (L40), and `test.py` (L59):
```python
# train_fea.py, Line 34
parser.add_argument('--data-root', type=str, default='./datasets')

# train.py, Line 40
parser.add_argument('--data-root', type=str, default='./datasets')

# test.py, Line 59
parser.add_argument('--data-root', type=str, default='./datasets')
```
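Alternatively, since `--data-root` is an ordinary argparse option in all three scripts, you can override it at invocation time instead of editing the defaults. This is only a sketch: the scripts may require additional arguments that are not shown here.

```bash
# Override the default data root on the command line instead of editing code.
python train_fea.py --data-root /path/to/your/datasets
python train.py     --data-root /path/to/your/datasets
python test.py      --data-root /path/to/your/datasets
```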
After finishing the above steps, your directory structure may look like this:
```
MetaFBP/
|–– data/
|–– dataset/
|   |–– FBP5500/
|   |–– FBPSCUT/
|   |–– US10K/
|–– model/
|–– util/
README.md
test.py
test_fea.py
train.py
train.sh
train_fea.py
train_fea.sh
```
- First of all, please train the universal feature extractor for each dataset:

  ```bash
  bash train_fea.sh PFBP-SCUT5500
  bash train_fea.sh PFBP-SCUT500
  bash train_fea.sh PFBP-US10K
  ```

  Usage of `train_fea.sh`: `bash train_fea.sh {arg1=dataset}`

  - `dataset` specifies which dataset to train on; the available options are `PFBP-SCUT5500`, `PFBP-SCUT500`, and `PFBP-US10K`.
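  A wrapper like `train_fea.sh` typically just forwards the chosen dataset to the Python entry point. The sketch below only illustrates that pattern; it is not the actual contents of the script, and the `--dataset` flag name is an assumption:

  ```bash
  #!/usr/bin/env bash
  # Hypothetical sketch of a train_fea.sh-style wrapper (not the real script):
  # take the dataset name as the first argument and hand it to train_fea.py.
  DATASET=${1:?usage: bash train_fea.sh <dataset>}   # e.g. PFBP-SCUT5500
  python train_fea.py --dataset "${DATASET}" --data-root ./datasets
  ```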
- Once the universal feature extractor is ready, you can run the experiments of the PFBP task. For example, the following command runs the experiment of `MetaFBP-R` on the PFBP-SCUT5500 benchmark with 5-way K-shot regression:

  ```bash
  bash train.sh MetaFBP-R PFBP-SCUT5500
  ```

  Usage of `train.sh`: `bash train.sh {arg1=model} {arg2=dataset}`

  - `model` specifies which model to use; the available options are `Base-MAML`, `MetaFBP-R`, and `MetaFBP-T`.
  - `dataset` specifies which dataset to train and test on; the available options are `PFBP-SCUT5500`, `PFBP-SCUT500`, and `PFBP-US10K`.
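  As a concrete usage example, training another model/dataset combination and then evaluating it might look like the snippet below. The `train.sh` call follows the usage above; the `test.py` invocation is a sketch in which `--data-root` is the only argument confirmed by this README:

  ```bash
  # Train MetaFBP-T on the PFBP-US10K benchmark.
  bash train.sh MetaFBP-T PFBP-US10K

  # Evaluate. Only --data-root is documented above; any further arguments
  # (checkpoint path, model name, etc.) are assumptions and may be required.
  python test.py --data-root ./datasets
  ```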
If you would like to cite our work, the following BibTeX entry may be helpful:

```bibtex
@inproceedings{lin2023metafbp,
  title={MetaFBP: Learning to Learn High-Order Predictor for Personalized Facial Beauty Prediction},
  author={Lin, Luojun and Shen, Zhifeng and Yin, Jia-Li and Liu, Qipeng and Yu, Yuanlong and Chen, Weijie},
  booktitle={Proceedings of the 31st ACM International Conference on Multimedia},
  year={2023},
}
```
- Our code is built on dragen1860/MAML-Pytorch - https://github.com/dragen1860/MAML-Pytorch
This source code is released under the MIT license. View it here