Snowfallingplum/DTMT


Exploring a Double Task Learning Framework for Makeup Transfer

This is the official PyTorch implementation of "Exploring a Double Task Learning Framework for Makeup Transfer" (DTMT).

Both the training code and the testing code are open source.

The framework of DTMT

Requirements

The environment needed to run our model is very simple, so we recommend simply using your own PyTorch environment. If you do, skip the environment creation below.

A suitable conda environment named DTMT can be created and activated with:

conda env create -f environment.yaml
conda activate DTMT

Download the MT dataset

  1. The MT dataset can be downloaded from the BeautyGAN project. Extract the downloaded archive and place it at the top level of this folder.
  2. Prepare face parsing maps. Face parsing is used in this code; in our experiments, the parsing maps were generated with https://github.com/zllrunning/face-parsing.PyTorch.
  3. Put the face parsing results in the .\MT-Dataset\seg1\makeup and .\MT-Dataset\seg1\non-makeup directories.
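Before training, it can help to verify that every face image has a matching parsing map in the expected directory. The helper below is not part of the DTMT repo; it is a small sketch, and the assumption that parsing maps share the image's filename (with a .png extension) is ours.

```python
import os

def find_missing_seg(image_names, seg_names):
    """Return image filenames that have no same-named parsing map.

    image_names: filenames from e.g. MT-Dataset/images/makeup
    seg_names:   filenames from e.g. MT-Dataset/seg1/makeup
    Matching is done on the filename stem; the parsing maps are
    assumed to be .png files with the same stem as the image.
    """
    seg_stems = {os.path.splitext(n)[0] for n in seg_names}
    return [n for n in image_names
            if os.path.splitext(n)[0] not in seg_stems]

# Example usage against the real dataset (paths from the steps above):
# missing = find_missing_seg(os.listdir('MT-Dataset/images/makeup'),
#                            os.listdir('MT-Dataset/seg1/makeup'))
# if missing: print('images without parsing maps:', missing)
```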

Training code

Default hyperparameters are set in the options.py file; modify them as needed. To train the model, run:

python train.py
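The actual option names defined in the repo's options.py are not shown here, so the sketch below only illustrates the common argparse pattern such a file follows; every option name and default in it is hypothetical, not taken from DTMT.

```python
import argparse

def get_options(args=None):
    """Illustrative options parser (names and defaults are hypothetical).

    Structuring options.py around argparse lets defaults be overridden
    from the command line instead of editing the file by hand.
    """
    parser = argparse.ArgumentParser(description='DTMT options (sketch)')
    parser.add_argument('--data_root', type=str, default='./MT-Dataset',
                        help='root directory of the MT dataset')
    parser.add_argument('--batch_size', type=int, default=1)
    parser.add_argument('--lr', type=float, default=2e-4,
                        help='learning rate')
    return parser.parse_args(args)

# e.g. python train.py --batch_size 4 (if options.py follows this pattern)
```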

Inference code

Default hyperparameters are set in the options.py file; modify them as needed. To run inference with the model, run:

python inference.py

The results of DTMT

Acknowledgement

Parts of the code are built upon PSGAN, face-parsing.PyTorch, and aster.pytorch.

License

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

CC BY-NC-SA 4.0
