ColorizeNet is an image colorization model based on ControlNet: it generates color images from black-and-white inputs, and is trained on top of the pre-trained Stable Diffusion v2.1 model released by Stability AI.
- Finetuned from model: [stabilityai/stable-diffusion-2-1](https://huggingface.co/stabilityai/stable-diffusion-2-1)
The model was trained on COCO: all images in the dataset were converted to grayscale and used as the conditioning input for the ControlNet. Training also requires a JSON file specifying, for each image pair, the input prompt together with the source and target images; the file used for training is provided at data/colorization/training/train.json. Prompts were obtained by randomly choosing one among several similar prompts generated for each image pair with InstructBLIP.
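Each entry in the JSON file pairs a grayscale source image with its color target and a caption. The field names and paths below are only illustrative (they follow the upstream ControlNet training convention of one JSON object per line), not the actual contents of train.json:

```json
{"source": "grayscale/000000123.jpg", "target": "color/000000123.jpg", "prompt": "a man riding a bicycle down a city street"}
```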
You can download the training data by running the download_data.sh script:

```bash
bash download_data.sh
```
Download the weights of the original SD model, used to initialize the ControlNet:

```bash
bash download_models.sh
```
The weights will be placed under the models folder.
Launch the script that creates the ControlNet weights, specifying the original weights and the path where to save the modified ones:

```bash
python tool_add_control.py weights/sd/v2-1_768-ema-pruned.pt weights/controlnet/v2-1_colorization.pt
```
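For intuition, this step follows the standard ControlNet initialization: every parameter of the augmented model inherits the pretrained SD weight when a counterpart exists, the control branch starts as a copy of the SD UNet encoder, and parameters without a counterpart (such as the zero convolutions) keep their fresh initialization. Below is a simplified sketch of that idea, not the actual code of tool_add_control.py; the function name and key-mapping details are assumptions.

```python
import torch

def init_controlnet_from_sd(sd_ckpt_path, new_model_state):
    """Sketch: fill a ControlNet-augmented state dict from a pretrained
    SD checkpoint. 'control_model.*' keys fall back to the matching
    'model.diffusion_model.*' weights (the trainable encoder copy);
    unmatched keys keep their fresh initialization."""
    ckpt = torch.load(sd_ckpt_path, map_location="cpu")
    sd_state = ckpt.get("state_dict", ckpt)
    merged = {}
    for name, param in new_model_state.items():
        source_name = name
        if name.startswith("control_model."):
            # The control branch is initialized as a copy of the SD UNet encoder
            source_name = "model.diffusion_model." + name[len("control_model."):]
        if source_name in sd_state and sd_state[source_name].shape == param.shape:
            merged[name] = sd_state[source_name].clone()  # inherited from SD v2.1
        else:
            merged[name] = param.clone()  # new parameter (e.g. zero conv), fresh init
    return merged
```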
Finally, launch the training script:

```bash
python train.py
```
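The training script consumes the image pairs and prompts listed in train.json. As a hypothetical illustration of that data flow, here is a minimal dataset sketch modeled on the upstream ControlNet tutorial; the class name, the one-object-per-line file format, and the jpg/txt/hint key convention are assumptions rather than this repository's actual loader.

```python
import json
import cv2
import numpy as np
from torch.utils.data import Dataset

class ColorizationDataset(Dataset):
    """Hypothetical loader for train.json-style entries (one JSON object per line)."""

    def __init__(self, json_path):
        with open(json_path) as f:
            self.items = [json.loads(line) for line in f]

    def __len__(self):
        return len(self.items)

    def __getitem__(self, idx):
        item = self.items[idx]
        # Grayscale conditioning image for the ControlNet branch
        source = cv2.cvtColor(cv2.imread(item["source"]), cv2.COLOR_BGR2RGB)
        # Color ground-truth image the diffusion model learns to produce
        target = cv2.cvtColor(cv2.imread(item["target"]), cv2.COLOR_BGR2RGB)
        # Normalization follows the ControlNet tutorial: hint in [0, 1], target in [-1, 1]
        source = source.astype(np.float32) / 255.0
        target = (target.astype(np.float32) / 127.5) - 1.0
        return dict(jpg=target, txt=item["prompt"], hint=source)
```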
For more information on ControlNet, please refer to the [original repository](https://github.com/lllyasviel/ControlNet).